MAJOR UPGRADE: Writing Automatically Adapts to Top-Ranked Pages

New update: I've automated a bunch of data-driven best practices so every article we generate gets you more bang for your buck.

June 19, 2024

Main insight from Google API leaks

Here are my main takeaways from the recent Google API leak:

  • Backlinks don't matter that much if you're not a big brand. You should still pursue backlinks as a small brand, but the ROI on time is minimal and Wraith Scribe doesn't help with this...yet.
  • The biggest signal for Google to rank your site is when their Chrome user stays on your page for a long time.

This is reasonable and follows from first-principles reasoning: "If your content is relevant, then the user will spend a longer time staring at your website."

The insight here has some nuances. If you're a dictionary website and you only have a simple definition for a simple word, the reverse is true: the more relevant your site is, the less the user has to look around to get the information they want. For the dictionary example, though, dictionary.com is a much bigger brand, so it will rank regardless of this factor.

And, because your objective (and mine) for using Wraith is to:

  1. Rank high
  2. Build trust with customers by having them spend more time with our content

We simply won't see ROI if we have a very simple definition that users spend 10 seconds on: they get what they want and leave our website (even if we rank high). 10 seconds = no trust is built.

Automated optimal article structures based on best-performing websites

Going off the insight that the primary goal of a blog post is to retain attention, I wanted to improve Wraith's article structure. Most readers scan headings and read what's relevant to them. Thus, if the structure of an article is an afterthought, it's unlikely to be optimized to retain reader attention.

Further, depending on the keyword's intent, the structure of the best-performing articles is different.

Previously, Wraith would generate an article structure based on some research it did on the topic. However, this had a bunch of randomness to it and wasn't very data-driven.

The new way:

  1. Look at the keyword and identify its intent (I trained a custom AI model to understand keyword intent).
  2. Pick a top-performing article structure (I scraped a bunch of rank-#1 websites across many keywords with varying intent to learn which structures perform best).
  3. Generate the outline based on research + best-performing article structure.
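The three steps above can be sketched roughly like this. Everything here — `classify_intent`, `STRUCTURE_LIBRARY`, the heuristic itself — is an illustrative placeholder, not Wraith Scribe's actual model or code:

```python
# Hypothetical sketch of the outline-generation pipeline.
# The real system uses a trained intent model and scraped structures;
# here, a trivial keyword heuristic and a hand-written library stand in.

STRUCTURE_LIBRARY = {
    # Skeleton outlines distilled (hypothetically) from rank-#1 pages, keyed by intent.
    "informational": ["What is X?", "How X works", "Common mistakes", "FAQ"],
    "commercial": ["Top picks at a glance", "Detailed reviews", "Buying guide", "Verdict"],
}

def classify_intent(keyword: str) -> str:
    """Stand-in for the custom intent model: a simple marker-word check."""
    commercial_markers = ("best", "review", "vs", "top", "cheap")
    if any(marker in keyword.lower() for marker in commercial_markers):
        return "commercial"
    return "informational"

def build_outline(keyword: str, research_notes: list[str]) -> list[str]:
    """Step 3: merge the best-performing skeleton with topic research."""
    skeleton = STRUCTURE_LIBRARY[classify_intent(keyword)]
    # Substitute the keyword into the skeleton; a real system would weave
    # the research into each section rather than appending it.
    return [section.replace("X", keyword) for section in skeleton] + research_notes
```

The point of the design is that the outline starts from a structure already proven to rank, instead of being improvised fresh each time.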

Using rank-#1 websites' article structures to guide article generation seems to create content that is 1) much more relevant and 2) much more in-depth. Both are huge wins for reader retention.

Better AI Images

Previously, Wraith generated photorealistic, stock-like images. However, current AI image tech can sometimes create very ugly or creepy images. Ugly images make readers leave your website, which kills retention.

One solution is to make the images more cartoon-y. However, that looks too comic-like, there doesn't seem to be much variety in the generated images, and the vibe is wrong for a B2B setting.

So instead of something like this:

[Image: old, comic-style, less approachable AI image]

I've further improved the images to look something like this instead:

[Image: new, more approachable AI image]

The latter's a lot more approachable and easier on the eyes, in my opinion.

What about product images?

AI cannot make product images since it has no reference for what new/novel products look like. In the future, we'll have a specialized writing engine that looks only at real product images relevant to the roundup article, then injects those into the article. Think: an AI robot that can search for a bunch of Amazon links and embed them so the product images are accurate, but can also supplement Amazon affiliate links with product photos from vendors' own websites. (Stopping at solely a list of Amazon affiliate links is nice, but there is an SEO hypothesis that a lack of link diversity can hurt rankings; at the very least, link diversity won't hurt.)

Automating Table of Contents

WordPress has Table of Contents plugins available. However, while setting up a new WordPress site, I realized users might not want to go through the effort of setting up a TOC plugin. More plugins = more to maintain over time.

So now, on an individual-article as well as a batch-article basis, you can choose whether Wraith auto-injects a table of contents into your WordPress posts.
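Mechanically, auto-injecting a TOC boils down to something like the sketch below: collect the post's `<h2>` headings, give each an anchor id, and prepend a linked list. This is a hedged illustration using plain regex over the post HTML, not Wraith's actual implementation:

```python
import re

def inject_toc(html: str) -> str:
    """Build a linked TOC from <h2> headings and prepend it to the post body."""
    headings = re.findall(r"<h2[^>]*>(.*?)</h2>", html, flags=re.S)
    if not headings:
        return html  # nothing to link to; leave the post untouched
    items = []
    out = html
    for text in headings:
        # Slugify the heading text so TOC links can jump to it.
        slug = re.sub(r"[^a-z0-9]+", "-", text.lower()).strip("-")
        out = out.replace(f"<h2>{text}</h2>", f'<h2 id="{slug}">{text}</h2>', 1)
        items.append(f'<li><a href="#{slug}">{text}</a></li>')
    toc = '<nav class="toc"><ul>' + "".join(items) + "</ul></nav>"
    return toc + out
```

A production version would also need to handle duplicate headings and pre-existing `id` attributes, which this sketch skips.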

Why isn't this on by default?

Two reasons. First, if you already have a Table of Contents (TOC) plugin for your WordPress site, this would create two tables of contents, which is silly.

The second and more important reason: the jury's still out on whether a TOC is good for SEO. Like every other feature, this one was built in a data-driven way. Having scraped a bunch of top-ranked pages:

  • Only ~38% of top-ranked pages have a TOC for informational keywords.
  • 64% of top-ranked pages have a TOC for commercial keywords.

You can draw one of three conclusions from this:

  1. Having a TOC does not affect rankings, since the average across both types of keywords is roughly 50%.
  2. Having a TOC affects rankings: It helps for commercial keywords but hurts for informational keywords.
  3. The data is inconclusive, but a TOC probably doesn't hurt, since 38% for informational keywords is not a small percentage at all.

Your choice

If a TOC doesn't affect rankings, and you're in the camp that a TOC lets users find the subheading they're looking for quicker (so they spend less time on your website and leave sooner, hurting your SEO), then you can just leave the "Add Table Of Contents" option off.

If TOC doesn't affect rankings and you think it won't hurt, or if you feel it'll actually help, then you can turn on the "Add Table Of Contents" option.

If you feel a TOC helps for some types of keywords and not others, you can leave it on for one batch of keywords while turning it off for another.

UI Improvement for deleting jobs

Previously, users could only delete jobs after they had failed. This was to prevent race conditions and attacks where in-progress articles are deleted, which wastes your tokens as well as my backend resources.

Now, however, you can also delete jobs that haven't started yet. When copy/pasting a bunch of keywords to generate a batch of articles, you might accidentally create a job with an irrelevant or wrong keyword. It's nice to be able to remove those from the job queue before you're charged tokens.
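The gating rule is simple enough to sketch. The state names here are my own illustration, not Wraith's actual job model:

```python
# Jobs may be removed only before work starts or after failure.
# In-progress jobs stay locked so tokens and backend work aren't
# wasted by race conditions or malicious deletes.
DELETABLE_STATES = {"queued", "failed"}

def can_delete(job_state: str) -> bool:
    """Return True if a job in this state may be safely deleted."""
    return job_state in DELETABLE_STATES
```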

Various AI bug fixes

There are too many tiny AI hallucination bugs to list them all here, but here are some I fixed, off the top of my head:

  • AI repeating the prompt in the output.
  • AI rephrasing important statistics when humanizing the text, thus making the data false. This seems to indicate the AI doesn't really understand what it's doing and is simply predicting words.
  • AI clumping markdown headings and paragraphs together, making headings extremely long.
  • AI randomly adding H1 headings inside H2 sections.
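A cleanup pass for that last bug can be sketched like this — an illustrative fix over markdown output, not Wraith's actual post-processing:

```python
def demote_stray_h1s(markdown: str) -> str:
    """Keep the first H1 as the article title; demote any later H1 to H2
    so it nests properly under existing sections."""
    out, seen_h1 = [], False
    for line in markdown.splitlines():
        if line.startswith("# "):  # matches H1 only; "## " doesn't start with "# "
            if seen_h1:
                line = "#" + line  # "# Heading" -> "## Heading"
            seen_h1 = True
        out.append(line)
    return "\n".join(out)
```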