Do you know why site structure is important for SEO?
You probably do, but for the sake of appeasing the SEO gods, let’s just recite some dogma.
Search engines send out spiders, robots, crawlers (pick your favorite descriptor) to explore the internet and tell stories about what they’ve discovered.
They take this information and dump it into a search engine database. This is what you get when you search — stored information, an index, of what’s out there.
This is over-simplified of course, but it’s an important process to understand because your site should make friends with these robots.
That means you make it easy for them to crawl your site, figure it out, and index it.
A good site structure will do that.
It’s Like Dating Both Robots And Humans
- For robots, a poor site structure means things like bad internal linking, a missing or misconfigured sitemap, unclear navigation, and an ineffective URL structure.
- For humans, it means getting lost in the navigation, being unsure of what page they’re on, not knowing how to find what they’re looking for, and eventually leaving.
Site structure is about how your website is crawled and indexed by robots and experienced by humans.
- Robots care about crawling and indexing your site, finding pages, getting a handle on content and semantics, and understanding what your site is about.
- Humans browse your site visually, so elements like navigation, menus, and breadcrumbs make a difference for usability.
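As a sketch of one human-friendly element mentioned above, a breadcrumb trail can be marked up as a simple list of links. The category names and URLs here are purely illustrative:

```html
<!-- A minimal breadcrumb trail; names and URLs are illustrative -->
<nav aria-label="Breadcrumb">
  <ol>
    <li><a href="/">Home</a></li>
    <li><a href="/beads/">Beads</a></li>
    <li aria-current="page">Glass Beads</li>
  </ol>
</nav>
```

Humans see where they are at a glance, and robots get an extra set of internal links that mirror your hierarchy.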
We’re going to look at a few site elements and best practices for making sure your website structure is up to par and caters to both robots and humans.
Let’s start with how visitors of all kinds navigate through your site.
For robots and humans, navigation should be easy to follow.
Standard practice is to place the navigation menus at the top and/or left. The exact placement depends on the site itself, but whatever you choose, it should be clear.
Avoid menus that are too deep and instead offer options and information quickly, with the fewest clicks needed. You want visitors to quickly identify where they are and how to get to what they’re looking for.
Navigation is also about hierarchy and site structure on a more technical level. This would be elements like pages, categories, sub-categories, tags, and more.
For robots, you want search engines to parse, index, and crawl through your site easily.
That’s where internal linking comes in.
Internal linking is the strategy of connecting related pages, which improves the functionality, usability, and “crawlability” of your website.
The idea is to reduce the number of clicks required and hoops a visitor has to jump through to get what they’re looking for.
For robots, you want to strategically use anchor text to create page and content relationships.
Keep in mind that with Penguin and moving forward, Google might penalize you for over-optimizing things like internal linking, mainly through “keyword stuffing” (using your long-tail keywords excessively).
However, don’t lose sleep over this and simply aim for a good experience for your visitors, where these links are relevant, useful, natural, and consistent.
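To sketch the difference, here is what descriptive versus generic anchor text might look like (the URL and wording are hypothetical examples, not from any real site):

```html
<!-- Descriptive anchor text tells robots and humans what the target page is about -->
<p>We also carry a wide range of <a href="/beads/glass-beads/">glass beads</a>.</p>

<!-- A generic anchor like this carries no relevance signal -->
<p>For glass beads, <a href="/beads/glass-beads/">click here</a>.</p>
```

The first version reads naturally and builds a relationship between the pages; the second wastes the link.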
Linking these together (see what I did there?), you should have a sitemap for visitors and robots.
Your website sitemap can come in two forms: one for search engines and one for your visitors.
The visitor sitemap is often a single page that shows all of your pages, subpages, posts, categories, and more. This will make it easier for people to find what they’re looking for.
A sitemap for search engines is an XML file made for the purpose of increasing crawlability: it tells those spiders and robots which pages exist and acts as a guide of sorts as they crawl your site.
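A minimal XML sitemap, following the sitemaps.org protocol, looks roughly like this (the URL and date are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One <url> entry per page you want crawled -->
  <url>
    <loc>https://www.yoursite.com/topic-name</loc>
    <lastmod>2024-01-15</lastmod>
  </url>
</urlset>
```

Most CMSs (WordPress included) can generate this file for you via a plugin, and you can then submit it through the search engines’ webmaster tools.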
And this sitemap leads humans and robots to pages, which in turn need to have a good structure to them as well — that is, easy to understand in one glance.
On-site elements are obviously important, but your code markup matters, too.
Great, now how much of your markup and code is valid? How much of it is done “right”? Have you had your markup checked?
Defective markup will affect crawlability and could lead to poor visitor experience.
Aside from that, you can, to some degree, control how pages are described by search engines; this mainly happens through meta descriptions, alt attributes, titles, and more.
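For a concrete sketch, these descriptive elements live in your markup like so (the page title, description text, and image are made up for illustration):

```html
<head>
  <!-- The title often becomes the clickable headline in search results -->
  <title>Venetian Glass Beads | YourSite</title>
  <!-- The meta description is frequently used as the snippet under that headline -->
  <meta name="description" content="Handmade Venetian glass beads in a range of colors and sizes.">
</head>

<!-- Alt text describes an image to search engines (and screen readers) -->
<img src="/images/venetian-bead.jpg" alt="Red Venetian glass bead with gold leaf">
```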
Pages usually link to other pages, so the issue of URLs matters for your site structure, too.
Pretty And Ugly URLs
Depending on your CMS (Content Management System like WordPress as an example), the URLs may be ugly (often dynamic) or pretty (often static).
An ugly URL would be: www.yoursite.com/?p=982#4%
A pretty URL, and far more useful for everyone, would look like this: www.yoursite.com/topic-name
Search engines have no problem with either one, but overall, you want a static (pretty) URL because it can contain your keywords and is more visitor-friendly.
These could also be combined into some type of hybrid, which you’ll often find on news and e-commerce websites (their URLs mix static and dynamically generated parts).
In general, you want to avoid long URLs with unnecessary session IDs and parameters — but if you can’t avoid them (because you run a large site, or multiple queries generate the same page), then you should use canonical tags.
This way, even if several URLs lead to the same page, only one is tagged to show up in the SERPs and receive any SEO value.
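A canonical tag is a single line in the page’s `<head>`. Reusing the example URLs from above (which are placeholders, not a real site):

```html
<!-- On a page reached via www.yoursite.com/?p=982, point search engines
     at the one preferred URL for this content -->
<link rel="canonical" href="https://www.yoursite.com/topic-name">
```

Every duplicate version of the page carries the same tag, so the SEO value consolidates onto the pretty URL.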
If you use sub-categories or sub-directories, you should aim for a simple structure with names that relate to the content.
This is where you can make use of head, middle, and long-tail keywords, such as: Beads > glass beads > Venetian glass beads.
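Carried into the URL structure, that hierarchy might look something like this (illustrative paths only):

```
www.yoursite.com/beads/
www.yoursite.com/beads/glass-beads/
www.yoursite.com/beads/glass-beads/venetian-glass-beads/
```

Each level narrows the keyword from head to long-tail while staying readable for both robots and humans.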
In fact, a good site structure should capitalize on keywords and long-tails with its navigation, categories, pages as well as the usual suspects for on-site optimization.
Keep Visitors, Date Robots
If it isn’t clear by now, let’s make it clear: a solid site structure will make your visitors happy and make robots fall in love.
What you do for one will overlap to the other — for example, a good site structure in terms of keywords, categories, and sub-categories will lead to clean and easy navigation that takes your visitors where they (and you) want them to go.
A website that is crawlable for spiders, is indexed accurately, and loads fast will increase rankings, visibility, and click-throughs to not just the homepage, but elsewhere on the site, too.
Do you have other ideas and tips on how to streamline your site structure to receive more SEO love? Share them in the comments, we’d love to hear them!