
9 Steps to SEO Utopia: The Web Developer’s SEO Edition


By Ricardo Figueiredo

When it comes to search engine optimization (SEO), most developers understand its importance to the success of a website, yet treat it as an afterthought to be taken care of in the post-production phase. But as SEO extends into design, UX, and wireframe planning, it has never been more important for a developer to grasp the essential elements of SEO during the pre-production phase of website development.

The truth is that many developers unintentionally skip crucial SEO steps when developing a new design, and once the new site goes live, it becomes much more difficult to implement SEO changes than while the site is in pre-production.

“Pre-production SEO” ensures that while a new design (or website) is in development, it is also going through the proper on-page optimization steps. Some of the most essential SEO elements can be implemented during production, and a developer can bring so much more to the table when SEO foundations are incorporated into their ideas.

Developers can make the most of their efforts and increase the value of their position by following 9 essential pre-production SEO steps.

Here’s how.

Navigation

Website architecture is one of the most important aspects of both usability and SEO. Navigation should consist of properly structured HTML, with CSS handling presentation and JavaScript adding behaviors such as drop-down, hover, or expanding menus. But developers need to be cognizant of design choices that also impact SEO. While most CMS systems have navigation styles built in, a developer building navigation from scratch should make sure menus still work when JavaScript is disabled, create text-based header links, and code all links so they are reachable no matter what browser is used. To double-check completed work, view a cached, text-only version of the page: all internal links, including the main navigation, should be visible there. Links that are missing will cause problems in the post-production phase.
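
For example, a crawlable menu can be built from plain HTML anchors, with CSS providing the drop-down behavior. This is only a minimal sketch; the page names and URLs are hypothetical:

<!-- Every destination is a plain <a href> link, so crawlers can follow
     it even if the CSS drop-down effect or any JavaScript enhancement
     fails to load. -->
<nav>
  <ul class="main-menu">
    <li><a href="/cameras/">Cameras</a>
      <ul class="submenu">
        <li><a href="/cameras/nikon">Nikon</a></li>
        <li><a href="/cameras/canon">Canon</a></li>
      </ul>
    </li>
    <li><a href="/about-us/">About Us</a></li>
  </ul>
</nav>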

Permalinks

URL structures, or permalinks, are an SEO staple, helping search engines understand which keywords are relevant to your website. The most obvious part of any given URL is the domain name, but equally important are the subdirectories, which define a site's depth and width. When creating pages in a site, it's critical to choose page and post names that are keyword-rich. Of course, a URL structure should never be used as a keyword-stuffing mechanism; instead, developers need to build subdirectories with client keywords top of mind. Avoid the temptation to categorize using numbers. Even though numbers make it easy to organize information consecutively, they don't give search engines what they need to understand the website in greater detail. For example, which URL provides the most information at face value?

http://www.sampleurl.com/133342/?p=435

http://www.sampleurl.com/cameras/nikon

The second URL is right on target: it creates a search engine-friendly structure while using a keyword-rich permalink configuration.

Furthermore, when developers are dealing with a site migration or redesign, it's important to keep in mind that if the new pages have a different URL structure, then 301 (permanent) redirects will be in order. For example, if the old "about" page was located at http://www.sampleurl.com/aboutus.html and the new version of that page will live at http://www.sampleurl.com/about-us/, the old URL should be 301 redirected to the new URL upon launch of the new site. This ensures that old URLs pass their SEO authority over to the new corresponding locations. If old pages were consolidated and have no corresponding page in the new site, developers can simply 301 redirect them to the homepage.
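
On an Apache server, that redirect is a single line in the site's .htaccess file. This is a minimal sketch using mod_alias and the hypothetical URLs above:

# Permanently redirect the old "about" page to its new URL
Redirect 301 /aboutus.html http://www.sampleurl.com/about-us/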

Headers, Title & Meta

Heading tags signal the importance of a webpage's content to search engines through simple <H1>, <H2>, and <H3> headlines written in plain HTML. Headings can still be styled with CSS to fit any design, and they are easy to implement, so there is no reason not to add them to the page code. Use headings to define titles and subtitles when uploading content to each page.
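
A product page, for instance, might structure its titles like this (the page topic here is hypothetical):

<h1>Nikon DSLR Cameras</h1>
<h2>Full-Frame Models</h2>
<p>Page copy about full-frame bodies goes here.</p>
<h2>Crop-Sensor Models</h2>
<p>Page copy about crop-sensor bodies goes here.</p>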

Title and meta tags are equally important when developing the <HEAD> section of your HTML page. Most CMS systems offer plugins or extensions that let a developer easily edit the title and meta values, but when hand-coding, a developer must include a title tag as well as meta tags for the description, author, robots directives, copyright, character set, and other relevant name/value pairs. Even though meta data is standard web procedure, many meta tags are obsolete these days, such as the keywords meta tag. To stay on top of developments in SEO, it doesn't hurt to take a one-day refresher course at any university extension.
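
A hand-coded <HEAD> covering those basics might look like the following sketch; the site name and description are hypothetical:

<head>
  <meta charset="utf-8">
  <title>Nikon DSLR Cameras | Sample Store</title>
  <meta name="description" content="Compare Nikon DSLR bodies, lenses, and accessories.">
  <meta name="author" content="Sample Store">
  <meta name="robots" content="index, follow">
  <meta name="copyright" content="Sample Store">
</head>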

Schema

Schema.org defines a shared vocabulary of structured-data markup that should be built into code whenever possible. Schema markup consists of HTML attributes used to mark up webpages so search engines can crawl and interpret them more effectively. The most popular search engines use Schema markup to understand a website in detail; in other words, it provides meaning beyond the keywords themselves. For example, with microdata, a developer can use the "itemscope," "itemprop," and "itemtype" attributes to further define headings and page content, creating a powerful website in terms of crawlability. Many microdata properties can also enhance a webpage's listing on search engine results pages. Whenever you see review stars or price information on a result listing in Google, that information came from the microdata in those webpages.
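
Here's a minimal microdata sketch for a product page; the product name, rating, and price values are all hypothetical:

<div itemscope itemtype="http://schema.org/Product">
  <h1 itemprop="name">Nikon D750</h1>
  <!-- Feeds the review stars shown in search results -->
  <div itemprop="aggregateRating" itemscope itemtype="http://schema.org/AggregateRating">
    Rated <span itemprop="ratingValue">4.7</span> out of 5
    by <span itemprop="reviewCount">89</span> reviewers
  </div>
  <!-- Feeds the price information shown in search results -->
  <div itemprop="offers" itemscope itemtype="http://schema.org/Offer">
    Price: $<span itemprop="price">1499.00</span>
    <meta itemprop="priceCurrency" content="USD">
  </div>
</div>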

Internal Hyperlinking

Most SEOs will agree that anchor text plays a significant role in SEO performance. A developer can enhance a website by following keyword/page linking strategies to make the most of on-page optimization. Ordinarily, an SEO will provide detailed instructions on which keywords link to which pages. When a developer is working independently, they can make the most of hyperlinking by creating naturally occurring hypertext based on page and post names. For example, if a page is named "Avatar Movie Review," the developer can hyperlink the same phrase elsewhere in the site with the link <a href="http://sampleurl.com/avatar-movie-review/">Avatar Movie Review</a>.

Properly-Coded JavaScript

Websites today offer sophisticated JavaScript applets that enhance the "cool factor" of a site. From toggle buttons to animation, these enhancements provide some of the most striking design elements in web development today. But as attractive as these features may be, they can pose quite a challenge for search engine indexing if the site leans too heavily on JavaScript to render its content. To test your design, disable JavaScript and make sure the content is still readable. This check is especially important for toggle buttons that hide or display text via JavaScript.
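
One pattern that keeps toggled content crawlable is to put the text in the HTML itself and let JavaScript control only its visibility. A minimal sketch, with hypothetical IDs and content:

<!-- The specifications live in the markup, so crawlers and
     no-JavaScript visitors can still read them. -->
<button id="spec-toggle">Show specifications</button>
<div id="specs">
  <p>Sensor: 24.3 MP full-frame CMOS</p>
</div>
<script>
  // JavaScript only toggles visibility; it never injects the content.
  document.getElementById('spec-toggle').addEventListener('click', function () {
    var specs = document.getElementById('specs');
    specs.style.display = (specs.style.display === 'none') ? '' : 'none';
  });
</script>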

ALT Tags

It's important to name your images and videos appropriately rather than settling for default numeric filenames. Because search engines use every piece of data, including picture, video, and graphics titles, as keywords to understand each page of a website, it's never been more important for a developer to name these assets properly. The ALT tag holds the "alternative text" rendered when the object (a picture, video, or graphic) is unable to load. For instance, a page about betta fish should have a picture with the ALT tag: <img src="http://upload.sampleurl.com/media/betta-fish.jpg" alt="A reddish blue male betta fish swimming in a bowl in a pet store."> The developer should use as much pertinent information as possible to ensure site usability should the picture fail to load, while also tending to the SEO-friendliness of the site.

Canonicalization

Unfortunately for web developers (and SEOs), multiple URLs serving so-called duplicate content are hard to avoid on popular web servers like Apache. Default settings can create canonicalization errors such as http://sampleurl.com and http://sampleurl.com/index.html: two identical webpages that search engines see as two unique URLs, causing a potential host of duplicate content issues. Inbound links are also diluted, since the search engines split credit between the two pages, reducing the efficacy of the site's SEO efforts. Canonicalization, which tells search engines which of the duplicate URLs is the preferred one, offers the solution. The common code, placed in the <HEAD> of each duplicate page, is <link rel="canonical" href="http://www.sampleurl.com/index.html" />

Another option to avoid duplicate content is a 301 "Moved Permanently" redirect. Developers on Apache will need to edit their .htaccess file to create the server-side URL redirect. It's critical for developers to implement proper canonical URLs to improve the efficacy of the site once it's live, not to mention saving a host of headaches in the site's post-production phase.
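
As a sketch, the following .htaccess rules (assuming Apache with mod_rewrite enabled) send external requests for /index.html back to the bare root URL with a 301:

# Enable the rewrite engine
RewriteEngine On
# Match only external requests that literally contain /index.html, so the
# rule doesn't loop when Apache serves the directory index internally
RewriteCond %{THE_REQUEST} \s/index\.html [NC]
RewriteRule ^index\.html$ / [R=301,L]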

Sitemap & Robots

A developer should never finish a website without a sitemap. A sitemap is an XML file that lists a website's URLs along with essential information about each page, including its importance, how often it is updated, and how it relates to other pages in the site. Sitemaps are essential to search engines, which use their information to learn about a website in greater detail and to crawl the site with ease. According to Sitemaps.org, a good sitemap must do the following (a sample follows the list):

  • Begin with <urlset> and end with </urlset> tags.
  • Specify namespace within the <urlset> tag.
  • Include the <url> tag for each page/post within a site.
  • Include the <loc> child entry for each <url> parent tag.
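
Putting those rules together, a minimal sitemap for the hypothetical sample site might look like this:

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.sampleurl.com/</loc>
  </url>
  <url>
    <loc>http://www.sampleurl.com/cameras/nikon</loc>
  </url>
</urlset>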

Finally, Robots.txt is a file located in the root folder of most websites. It is one of the first checkpoints for search engine crawlers and gives instructions to web robots under what is known as the "Robots Exclusion Protocol."

A developer may use the Robots.txt file to completely block search engine crawlers from a development server while working on a new web project. Once the website is launched, it's important to check the Robots.txt file again to ensure it isn't still blocking all crawlers from accessing the live site.

While developers will use a Robots.txt file to block certain folders from search engines, one line is often left out: the line that tells the search engine crawler where to find the Sitemap.xml file. If the Sitemap.xml file is located in the root folder of the website, as it typically is, the search engine crawlers will find it. In some instances, however, the file lives in a sub-folder, and the Sitemap directive in the Robots.txt file helps the crawlers find it. It's important for developers to code the Robots.txt file properly so the sitemap can be easily located.

A typical Robots.txt file may look like this:

User-agent: *
Disallow: /administrator/
Disallow: /installation/
Sitemap: http://www.sampleurl.com/sitemap.xml

There's no doubt that the world of SEO is vast and ever-changing, so it's important to stay up to date as search engine optimization evolves. Developers should now feel confident in their ability to build a website that meets design and user experience requirements while writing code that is search engine friendly right out of the box.

Here’s to creating beautiful — and SEO-friendly — sites across the web.

About Ricardo Figueiredo

Ricardo Figueiredo is the co-founder of Elevated Search, a boutique firm of SEO experts based out of San Diego. Ricardo has over 8 years of experience in SEO project management, specializing in on-page optimization, link building, local SEO, and conversion optimization. Connect with Ricardo on Google+.
