Basic SEO - a guide for indiehackers

I thought I'd spew whatever search engine optimisation knowledge I have onto a page, and it ended up being more than I expected. Hopefully this helps some of you with the basics, and there are a few tidbits of useful information in here that you weren't already aware of.

Keywords

Deciding on what keywords to use for an article or page on your website is typically a balancing act. Too wide and varied and you'll have no traffic; too narrow and focused and you'll have too much competition to be able to rank your content - at least until you start attracting links. One of the best ways to determine which keywords to chase is to actually run Google Ads. Once you can see the volume of traffic and the clickthrough that a paid-for advert gets, you can apply that strategy to organic traffic, i.e. SEO. This can be particularly effective if your product dynamically generates meta descriptions and you are wondering what keywords to use. For more ideas, have a look at Yoast's Google Suggest Expander.

As Google uses what people type to power this, you can get a feel for what would be good long-tail key phrases to use within your article or page. The Google mobile app also showcases what topics are trending that day as autosuggestions; if you are interested in catching a wave of traffic, that is another source of information about what is trending on the web and what might be worth writing about.

Meta Descriptions

Everyone's favourite (and probably most misunderstood) on-page HTML element, meta descriptions have their place in a webmaster's kit bag as a way to increase the number of clicks you receive on Google. Yes, that's right: they don't play any part in the ranking of your website. Repeat after me: meta descriptions don't help ranking, meta descriptions increase clickthroughs.

You should take care to craft unique meta descriptions for each page and, if you can (in natural language), use the keywords you would like to target for that page - not for a ranking boost, but merely because they will be bolded in the snippet when they match the searcher's query. This in itself increases the clickthrough to your website.

The importance of using unique meta descriptions, simply put, is this: they can be an indicator to Google that the page containing them is also unique. Unique content = more pages in Google's index = more traffic.

Meta descriptions should also follow the same guidelines that Google gives you when creating Google Ads, e.g. use Title Case To Increase The Clickthrough, and don't exceed 160 characters.
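For illustration, a hand-written description for a hypothetical green widgets page (the domain and wording are made up) might look like this:

    <meta name="description" content="Compare Green Widgets Side By Side - Prices, Reviews And Free Shipping On Orders Over $50.">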

Semantic HTML

There is something of a misconception out there in relation to HTML structure and Google. What you want to do is structure your page in a natural way, with elements of increased importance at the top and the HTML structured accordingly. E.g. use an H1 for the most important keyword you are trying to rank for, then use heading tags H2 - H5 for other similar phrases and keywords that complement the original phrase. Basically, as Google tries to figure out what a page is about, it relies on the structure of your HTML to help it out; headings are a big plus. In terms of making sure your HTML is valid - again, people will tell you semantic HTML helps you rank higher on Google (hint: it doesn't).
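As a rough sketch, assuming a page targeting "green widgets" (a made-up example), that heading structure might look like:

    <h1>Green Widgets</h1>
    <p>Intro copy about green widgets...</p>
    <h2>Why Green Widgets Beat Blue Widgets</h2>
    <p>Supporting copy...</p>
    <h2>Choosing the Right Green Widget</h2>
    <p>More supporting copy...</p>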

Googlebot is now sophisticated enough to parse even the worst HTML - you can imagine how much shitty code there is on the web, and what impact it would have on Google's business if they hadn't already figured this problem out. Semantics, therefore, are only necessary to provide additional structure that helps figure out what the content is all about.

Title Tags

Titles are probably the most important tags in your HTML for marketing your website. You should treat them as such, and as with heading tags, you need to balance interesting titles with titles that contain the keywords describing the content on that particular page.

Many brands miss a trick by putting their company name in every page title on their website, when in actual fact the only page that arguably needs it is the home page. The reason you should leave it off elsewhere? If you have a strong and unique brand name, your site will get found anyway from a natural search.

Unique phrases (such as brand names) generally rank at number one naturally anyway. Putting a company name in every page title only dilutes the other keywords Google has to crunch on. If every one of your blog posts has "Website Name" on the end of its title, you are probably leaving search traffic on the table.
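A sketch of the difference, using a made-up blog post title:

    <!-- Diluted: the brand name eats into the keyword space -->
    <title>10 Ways To Grow Green Widgets | Website Name</title>

    <!-- Focused: the whole title is spent on keywords -->
    <title>10 Ways To Grow Green Widgets This Spring</title>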

That said, some larger companies add their company name to every page so that customers who recognise them on internal product pages will click through, irrespective of where they sit in the search results. Chances are, your brand as an indie hacker hasn't quite reached those proportions yet.

Links

Links are still the lifeblood of a healthy SEO strategy. Without going into too much depth, Larry and Sergey built Google when they realised that the more citations a scientific paper had, the more likely it was to be an authoritative source on the topic in question. Google was built around the same premise: links are citations, and the more of them you have, the more powerful your site and the higher you should rank. Not all links are created equal, however.

The link text used both internally (on your own website) and on third-party websites linking back to you should, if at all possible, contain your key phrase, and your link profile should look natural so as not to trip any spam triggers.

For example, it's worthwhile having a blog post on a third-party website pointing back to your home page with the phrase "green widgets" if that is what you are trying to rank for. However, if Google sees that you are deliberately trying to manipulate your link profile - say, by adding 100 links pointing back to your website with just the phrase "green widgets" - then it's likely you could pick up a penalty and tank in the results.

Slow, steady, natural acquisition of links by concentrating on great content and the promotion of that content is the easiest way to acquire links pointing back to your business - but it's a long slog. It's also worth noting that any link tagged with rel="nofollow" will not pass link equity back to your site, so be aware of this when promoting your site.
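For reference, a followed link versus a nofollow link looks like this (domain and anchor text are placeholders):

    <!-- Passes link equity -->
    <a href="https://www.domain.com/">green widgets</a>

    <!-- Does not pass link equity -->
    <a href="https://www.domain.com/" rel="nofollow">green widgets</a>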

Other ways of acquiring links include exploring your competitors' link profiles to see where they have picked up links. There are commercial tools out there for this sort of thing; three of the better-known ones are Open Site Explorer (now Link Explorer) from Moz (https://moz.com/link-explorer), SEMrush (https://www.semrush.com/) and Ahrefs (https://ahrefs.com/).

You can then use this backlink data to submit your website to the same places - it's an easy way of seeing where they might be getting traction. Spyfu (https://www.spyfu.com/) is also an interesting tool that lets you see the ads your competitors are running. You can then shape your own ads to chase the same keywords, or change your organic strategy to chase them.

Linkbait / Product Bait

Similar to the above - having a good link bait strategy is important. It's not enough to just create a blog post every week and hope traffic will magically end up on the page. The best marketing comes when you create a piece of content that takes weeks to pull off and has some sort of viral loop built into it. That means deciding which content you hope to knock off the top perch on Google for a particular key phrase, researching it deeply and writing a significant content piece on it. Aim for an article somewhere in the range of 3,000+ words to truly be considered "deep", then make sure you've got some form of sharing built into the article body. You might consider offering people who share the article an additional incentive to do so, whether that is a free download or something else - the more shares you get, the more likely you are to end up on the radar of people who curate content on the web, and that will indirectly mean links.

Indie makers are also at an advantage in that they can create tool bait, or mini products, to attract an audience, and you get the bonus traffic of something like a Product Hunt launch feeding back into your page. Jon Yongfook did this recently for Bannerbear, and it's an idea popularised by Gabriel Weinberg in the book Traction. A couple of other examples were recently mentioned in a thread here on Indie Hackers.

Open Graph Data

With social media playing a significant role in sending you traffic, you will want to maximise clickthroughs across Facebook, Twitter, WhatsApp etc. Open Graph tags not only allow you to customise the message for each platform - e.g. you can write a specific tag for Twitter and a specific one for Facebook to increase engagement - but they also allow you to specify a thumbnail image to accompany the article.

The official documentation for Facebook is over here: https://developers.facebook.com/docs/sharing/webmasters/

Twitter's Open Graph documentation is here: https://developer.twitter.com/en/docs/tweets/optimize-with-cards/guides/getting-started

One good tip is to make sure the image for Facebook in particular is at least 1200 pixels wide - this ensures that the size of your preview card is maximised. Smaller images result in a smaller preview card, which isn't as attractive in the news feed. Other platforms typically look for Facebook's Open Graph tags before Twitter Card markup, so if you are making an engineering decision on which one to implement, make it the Facebook one; it's become something of an unwritten standard. You can preview how your site will look when shared using the Open Graph debugger here: https://developers.facebook.com/tools/debug/ - this also lets you clear the cache if you update your tags and want a new image to pull through.
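As a minimal sketch (URLs and copy are placeholders), the tags for an article might look like this; Twitter falls back to the Open Graph tags for anything you don't override with its own twitter:* tags:

    <meta property="og:title" content="Basic SEO - a guide for indiehackers" />
    <meta property="og:description" content="Keywords, meta descriptions, links and more." />
    <meta property="og:url" content="https://www.domain.com/basic-seo" />
    <meta property="og:image" content="https://www.domain.com/images/preview-1200x630.png" />
    <meta name="twitter:card" content="summary_large_image" />
    <meta name="twitter:title" content="Basic SEO - a guide for indiehackers" />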

Webmaster Tools

Webmaster tools give you a way to check a number of search engine related factors. You can check indexing status (how many of your pages have made their way into the index), backlinks (places where search engines have found links to your website), robots (to test your robots.txt file), keyword searches (including positions and clickthrough data) and any errors that Googlebot finds with your content.

This information is available from both Google and Bing, and you might as well set up accounts with both, as you may find information that correlates between the two. Google Webmaster Tools firstly requires a Google account. Once you've signed up and logged in, you will have to verify your site with a meta tag, or by uploading a file, to prove you are the owner of the site.
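The meta tag route looks something like this (the content value is a placeholder for the token Google generates for you):

    <meta name="google-site-verification" content="your-verification-token" />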

It's well worth doing this for both search engines at the start of a project, as robots.txt files and sitemaps are verified through these tools. Google will also regularly email you when it finds issues that need addressing.

Robots

Sometimes it makes sense to tell Google which parts of your website aren't really worth offering to visitors via its search engine, using a robots.txt file. Typically this would include things that are sensitive (private documents), or things such as login pages which you don't want naughty robots sniffing around. Even if you want your entire site indexed, it's still worthwhile adding one to prevent 404s from showing up in your raw server logs (as robots request this file).
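A minimal robots.txt for a made-up site with a login page and a private directory looks like this (bear in mind robots.txt is advisory only, so don't rely on it to hide genuinely sensitive content):

    User-agent: *
    Disallow: /login
    Disallow: /private/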

Webmaster tools for both Google and Bing include testing tools to make sure your robots.txt is kosher, so assuming you've set this up, there's really no excuse for not getting it right.

Sitemaps

Sitemaps not only give crawlers a comprehensive list of URLs to check frequently, they also result in faster indexing of your site, meaning new content gets to the search engines quicker.

If you think about it, it makes sense to provide an easier, more structured way for Google and others to find your new content, rather than having them parse through tag soup HTML (even though they are pretty darn good at that by now). The faster search engines are, the more profitable they are: they catch waves of temporal traffic and save money on processing the information you provide to them.

Sitemaps can (and should) be specified in your robots.txt file, and directly in Webmaster tools.
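A bare-bones sitemap with a single URL (placeholder domain), plus the robots.txt line that points crawlers at it:

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>https://www.domain.com/fruit/apples-and-oranges</loc>
        <lastmod>2020-06-01</lastmod>
      </url>
    </urlset>

And in robots.txt:

    Sitemap: https://www.domain.com/sitemap.xml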

Images

Every image on your site should be optimised for maximum search benefit with alt and title attributes where possible; if your site is particularly image rich, it can otherwise be difficult to signal relevance to Google.
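In practice that just means filling in the alt and title attributes on each image (example values made up):

    <img src="/images/green-widget.jpg" alt="A green widget mounted on a workbench" title="Green widget">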

Pinterest use machine learning to identify images and generate the necessary alt text etc. to growth hack Google image search. There's a post out there claiming they scrape the search engines to do this, which I'm taking with a pinch of salt; it's much more likely that they have built an ML model and this reflects what the model finds elsewhere on the web. If you are technical enough, you could arguably do the same, or use an API to provide richer data to Google if your startup is heavily image focused.

One other tip is to add images inside the sitemap protocol. Further information on how to go about that is available on the official Google Webmaster blog.
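As a sketch, image entries extend a normal sitemap with an extra namespace (placeholder URLs):

    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
            xmlns:image="http://www.google.com/schemas/sitemap-image/1.1">
      <url>
        <loc>https://www.domain.com/fruit/apples-and-oranges</loc>
        <image:image>
          <image:loc>https://www.domain.com/images/green-apples.jpg</image:loc>
        </image:image>
      </url>
    </urlset>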

It's also worth pointing out that high-impact photos on your site will greatly enhance the chances of your content being shared, so take the time and attention to polish your content before publishing.

URLs

Your website URLs should, if at all possible, be rewritten to include keywords that underline the main focus of the page. You have probably come across links on the web that look like this:

http://www.domain.com/?p=72&s=0

Well, the thing is, search engines aren't really able to work out what is going on there; nothing tells them what exactly p=72 means. If, on the other hand, your URL looks like this:

http://www.domain.com/fruit/apples-and-oranges

Google has a pretty good idea what the focus of that page will be before it even parses the page. Take this advice and apply it sparingly, though: if you already have a structure on the web that looks like the former, it may not be worth the effort to switch to friendly URLs, as existing links out there will break!

Some people would instinctively redirect any broken URLs that occur, but remember, 301 redirects do not carry all the juice, so be prepared to wait a while to rebuild your authority - it will probably be worth the effort in the long run. If you are starting a project from scratch, take the time and effort to consult with developers to ensure they are thinking this way and following best practice from the get-go. Something else I regularly see people do is set up a subdomain instead of a subfolder for a company blog; each is treated as an independent entity by Google, so it's best to start off with a subfolder if you can.
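To illustrate the redirect point, here is a rough sketch of a 301 from the old parameter-based URL above to the friendly one, assuming an Apache server with mod_rewrite enabled and a .htaccess file at the web root (adapt for your own stack):

    RewriteEngine On
    # Match the old query string exactly, then issue a permanent redirect
    RewriteCond %{QUERY_STRING} ^p=72&s=0$
    # The trailing ? drops the old query string from the new URL
    RewriteRule ^$ /fruit/apples-and-oranges? [R=301,L]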

Architecture

Your website architecture is an important consideration when developing a site. You may choose a flat architecture with no directories, or multiple directories. Google suggests that deep directory structures don't work as well from an SEO point of view. Many people also suggest that having dates in URLs is a no-no, as it signals to visitors when content is old and adds unnecessary information to your URLs.

The flip side is that dates enable much deeper directory diving in Google Analytics! I can, for example, work out that content from January attracted more pageviews than content from February, or that 2020 was a more successful year than 2018, because Google automagically categorises content according to these pseudo directories. Just another thing to think about when deciding on your site architecture.

Canonical Tags

If your site has multiple URLs which serve the same purpose and deliver the same content, you may be suffering from duplicate content issues. Simply put, this is where search engines can't work out which version is the most important to show in the results, so they decide on the best one themselves. That may not be what you intended, and it is commonly the fault of the content management system or site architecture in question.

Thankfully, Google and the other engines have agreed to support a tag, known as the canonical link element, which specifies which page is the preferred one. This results in link juice flowing correctly (if people have linked to the wrong version) and helps Google figure out which page you intended to show in the SERPs.
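The tag goes in the head of every duplicate variant, pointing at the preferred URL (placeholder domain):

    <link rel="canonical" href="https://www.domain.com/fruit/apples-and-oranges" />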

Google Search Quality Raters Guidelines

To add a human element to the algorithm, Google employs raters to manually assess the quality of search results. The guidelines these raters work to were originally leaked on the web and subsequently published officially for everyone to see. It's worthwhile reading them front to back to get a feel for the sorts of things to do and not to do, as well as to see what Google considers a quality website.

What have I missed? I'm happy to add any other SEO tips or tricks that anyone wants to share to help the community.
