Robot Parts Co.

A quick look at optimizing content for search engines so the site shows up better in search results.

Goal

We’re going to explore a bunch of SEO concepts on a fake website so we can look at the important tags & content and see how to improve the search engine juice.

All these techniques will later be applied to your portfolio website.

Fork & clone

Start the lesson by forking and cloning the robot-parts-co repository.

The repository will have some starter files to get you on your way and include requirements for Markbot so you can be sure you’ve completed the lesson.

  1. Fork, clone & Markbot

    This includes some starter code that you can get by forking and cloning the repository. You’ll use Markbot to double check everything is done properly.

1 Set up the project

We’re going to take a mostly complete website and enhance it by adding better SEO and metadata to the code.

You should see the following files in the folder that you’ve cloned:

robot-parts-co/
  css/
    main.css
  images/
    logo.svg
    products.svg
    social.jpg
  index.html
  page2.html
  page3.html

  1. Type it, type it real good

    Remember the purpose of this lesson is to type the code out yourself—build up that muscle memory in your fingers!

  2. Naming conventions

    Don’t forget to follow the naming conventions.

2 Rename some files

One thing search engines look at when trying to determine the topic and keywords of a page is its URL and the words within it.

We should rename some of these files to be better for search engines.

robot-parts-co/
  css/
    main.css
  images/
    logo.svg      ➔  robot-parts-co.svg (matches the company name)
    products.svg  ➔  gears-power-cells.svg (more descriptive)
    social.jpg
  index.html
  page2.html      ➔  robot-parts.html (more descriptive)
  page3.html      ➔  contact.html (more descriptive)

Don’t forget to change all the references in the HTML.
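
For example, anywhere the HTML pointed at logo.svg or page2.html it should now point at the new filenames. Here’s a sketch of a couple of updated references (the surrounding markup is assumed):

⋮
<img src="images/robot-parts-co.svg" alt="Robot Parts Co.">
⋮
<a href="robot-parts.html">Products</a>
⋮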

3 Improve the <title> tags

The <title> tag is one of the most important pieces of content from an SEO perspective: it’s searched for keywords and it’s used on the search results pages.

We need to write nice, keyword-heavy <title> tags that are completely unique on every page of the website.

<!DOCTYPE html>
<html lang="en-ca">
<head>
  <meta charset="utf-8">
  <title>Robot Parts Co. · Every part to conquer the world · Robot Mega City 1, RoboCountry211</title>
  <meta name="viewport" content="width=device-width,initial-scale=1">
⋮
<!DOCTYPE html>
<html lang="en-ca">
<head>
  <meta charset="utf-8">
  <title>Robot Parts · Robot Parts Co.</title>
  <meta name="viewport" content="width=device-width,initial-scale=1">
⋮
<!DOCTYPE html>
<html lang="en-ca">
<head>
  <meta charset="utf-8">
  <title>Contact · Robot Parts Co.</title>
  <meta name="viewport" content="width=device-width,initial-scale=1">
⋮
  1. Keyboard shortcut

    To create the middot (·) press ⌥⇧9 on a Mac. (In HTML you could also write the &middot; entity.)

  2. index.html — Line E

    The homepage title follows the format:

    Site/Company Name · Small, keyword-rich, catchy phrase · City, Country
    
  3. robot-parts.html — Line E

    The inside page titles follow the format:

    Page Title · Site/Company Name
    

4 Add <meta> descriptions

The <meta> description tag should be included on all websites—and be completely unique on every page.

Search engines use this as the little description under the link on search results pages. Or sometimes, if it’s better, they’ll use the first few words from the first <p> tag.

The max length for a <meta> description is 150 characters.

<!DOCTYPE html>
<html lang="en-ca">
<head>
  <meta charset="utf-8">
  <title>Robot Parts Co. · Every part to conquer the world · Robot Mega City 1, RoboCountry211</title>
  <meta name="description" content="A mega conglomerate corporation selling and servicing all totalitarian robots.">
  <meta name="viewport" content="width=device-width,initial-scale=1">
⋮
<!DOCTYPE html>
<html lang="en-ca">
<head>
  <meta charset="utf-8">
  <title>Robot Parts · Robot Parts Co.</title>
  <meta name="description" content="Every part or piece a destructive robot needs: gears, rotors, manipulators, sensors, power cells and more.">
  <meta name="viewport" content="width=device-width,initial-scale=1">
⋮
<!DOCTYPE html>
<html lang="en-ca">
<head>
  <meta charset="utf-8">
  <title>Contact · Robot Parts Co.</title>
  <meta name="description" content="Send us digital beams or stop by the warehouse to order a part that we’re missing.">
  <meta name="viewport" content="width=device-width,initial-scale=1">
⋮
  1. Reminder

    The <meta> description tag should be completely unique for every single page on the website.

5 Fix the masthead

There’s some important information in the <header> and top of the website that could be improved for search engine friendliness.

⋮
<header>
  <h1><img src="images/robot-parts-co.svg" alt="Robot Parts Co."></h1>
  <nav>
    <ul>
      <li><a href="index.html">Home</a></li>
      <li><a href="robot-parts.html">Robot Parts</a></li>
      <li><a href="contact.html">Contact</a></li>
    </ul>
  </nav>
</header>
⋮
⋮
<header>
  <strong><img src="images/robot-parts-co.svg" alt="Robot Parts Co."></strong>
  <nav>
    <ul>
      <li><a href="index.html">Home</a></li>
      <li><a href="robot-parts.html">Robot Parts</a></li>
      <li><a href="contact.html">Contact</a></li>
    </ul>
  </nav>
</header>

<main>
  <h1>Robot Parts</h1>
⋮

Go ahead and make the <h1> to <strong> adjustment on the contact.html page too. Then change the <h2> to an <h1>.
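
On contact.html the adjustment would end up looking something like this sketch (the heading text is assumed to be “Contact”):

⋮
<header>
  <strong><img src="images/robot-parts-co.svg" alt="Robot Parts Co."></strong>
  <!-- The <nav> stays exactly the same -->
</header>

<main>
  <h1>Contact</h1>
⋮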

  1. index.html — Line C

    Since the logo is inside the <h1> the alt="" attribute is extremely important—it should be the name of the company only.

  2. index.html — Line G

    The titles of pages in the navigation are really important—“Products” is too generic so we’ll change it to “Robot Parts”.

  3. robot-parts.html — Line C

    Every page must have a completely unique <h1>—that means that the logo cannot be an <h1> tag on the inside pages because it would match the homepage.

    So, on inside pages, the logo should probably be a <strong> tag.

  4. robot-parts.html — Line N

    Now that the <h1> tag has been freed from the logo, we should use it on the most important piece of content on this page: the page title.

    While we’re at it, let’s change the title to be slightly more descriptive than “Products”.

6 Homepage content fixes

Looking at the homepage, there are a few adjustments we can make to the content to make it more search engine friendly.

⋮
</header>

<main>
  <p><strong>Every part a destructive robot needs for total world domination.</strong></p>
  <img src="images/gears-power-cells.svg" alt="A selection of the best gears and power cells available to buy">
  <a href="robot-parts.html">Buy robot parts</a>
</main>

<footer>
⋮
  1. Line E

    The website introductory paragraph was really generic—we changed it to something more relevant.

    Also, since it’s an important paragraph and search engines pay attention to <strong> tags we surrounded it with a <strong> tag.

  2. Line F

    The <img> tag should have a really descriptive, keyword laden alt="" attribute to help with search results.

  3. Line G

    The text inside the <a> tag was too generic so we changed it to something more descriptive of the page we’re linking to.

7 Add location information

It’s helpful for search engines to know the location of the website, especially if it’s a physical business. They can use that information to plot on maps, etc.

There are a few tags to define the geographic location using latitude & longitude:

⋮
  <link href="css/main.css" rel="stylesheet">
  <meta name="ICBM" content="45.41117,-75.69812">
  <meta name="geo.position" content="45.41117;-75.69812">
  <meta name="geo.region" content="ca-on">
  <meta name="geo.placename" content="Ottawa">
</head>
<body>
⋮

Make sure to copy this into robot-parts.html & contact.html too.

  1. Lines C–D

    Add the latitude & longitude of the physical location, or even just the city.

  2. Line E

    Specify the region’s country code: ca-on means “Canada, Ontario”.

  3. Line F

    Specify the city, village, town, etc. of the location.

8 Add social media tags

When somebody shares our website on social media we can control what information shows in the share box by adding a few more <meta> tags to the head of our document.

There are primarily 2 chunks of information: OpenGraph (used by Facebook, Twitter, Instagram, Pinterest, LinkedIn & more) and a few Twitter-specific tags.

(Twitter has duplicated many of the OpenGraph tags, but those duplicates are totally optional.)

⋮
  <meta name="geo.placename" content="Ottawa">
  <meta property="og:type" content="website">
  <meta property="og:title" content="Robot Parts Co. · Every part to conquer the world">
  <meta property="og:url" content="https://robotparts.co/">
  <meta property="og:image" content="https://robotparts.co/images/social.jpg">
  <meta property="og:site_name" content="Robot Parts Co.">
  <meta property="og:description" content="A mega conglomerate corporation selling and servicing all totalitarian robots.">
  <meta property="og:locale" content="en_CA">
  <meta name="twitter:card" content="summary">
  <meta name="twitter:site" content="@robotpartsco">
</head>
<body>
⋮

Copy this into robot-parts.html & contact.html too. It’s extremely critical that og:title, og:url & og:description are different for every page. I usually just copy from the other <meta> tags.
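
On contact.html, for example, those three tags might end up like this sketch, reusing the page’s <title> & description from earlier:

⋮
  <meta property="og:title" content="Contact · Robot Parts Co.">
  <meta property="og:url" content="https://robotparts.co/contact.html">
  <meta property="og:description" content="Send us digital beams or stop by the warehouse to order a part that we’re missing.">
⋮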

  1. Lines C–I

    These are the OpenGraph tags. They’ll work on just about any social media website, including Twitter.

  2. Lines J–K

    Twitter has a few of its own extra tags; these are the two that are kind of necessary. The rest of the data will be pulled from OpenGraph.

9 Humans.txt

The humans.txt file is a place to write the credits of your website: the designers, developers, etc.

It’s also a wonderful place to cite the location of where you got assets. If you downloaded icons from a website you need to specify where, when, etc.—like a bibliography.

robot-parts-co/
  css/
  images/
  index.html
  contact.html
  robot-parts.html
  humans.txt

Create a new file named humans.txt inside the root folder.

# humanstxt.org

# TEAM

(Write all your team members’ information in here)
(Any format you’d like—I usually use Markdown)

---

# THANKS

(This is the place where you put sources of things)
(Locations where you got icons, etc.)

---

# TECHNOLOGY

## Software

(Write what software you’re using)

## Hosting

(Write where the website is hosted & services)

  1. Lines E–F

    Configure this to specify your team members, their locations & contact information.

  2. Lines L–M

    Add the citations & bibliographic entries here. Or thank humans who helped you make the project.

  3. Lines U & Y

    Use these sections to describe the tools you used: your code editor, version control, where the website is hosted, etc.
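
As a sketch, a filled-in version might look something like this (the name & sources here are made up):

# TEAM

Roberta Builder: designer & developer
Ottawa, Canada

---

# THANKS

Icon images: The Noun Project (https://thenounproject.com)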

10 Robots.txt

The robots.txt file is for search engine crawlers—to prevent them from showing certain things in search results. Remember, only honest search engines will follow these rules.

For instance, if you don’t want your images to show in image search results you could block your images folder (there’s a sketch of that after the basic example below).

robot-parts-co/
  css/
  images/
  index.html
  contact.html
  robot-parts.html
  humans.txt
  robots.txt

Create a new file named robots.txt inside the root folder.

# robotstxt.org

Sitemap: https://robotparts.co/sitemap.xml

User-Agent: *
Disallow:

Here’s the most basic one you should have: it says, “Don’t block anything”.

Also notice it’s pointing to our sitemap.xml—that’s the next step.
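
If you did want to keep the images folder out of image search results, as mentioned above, the rule would look something like this sketch:

# robotstxt.org

User-Agent: *
Disallow: /images/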

11 Search engine sitemap

The sitemap.xml file is used by search engines to help them spider your website. It’s a large list of all the pages on the website.

This is different from a user-facing sitemap: it’s specifically targeted at search engine robots, not humans. The visual sitemap you’ve worked with previously is a separate thing you should probably also include.

robot-parts-co/
  css/
  images/
  index.html
  contact.html
  robot-parts.html
  humans.txt
  robots.txt
  sitemap.xml

Create a new file named sitemap.xml inside the root folder.

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">

  <url>
     <loc>https://robotparts.co/</loc>
     <lastmod>2027-10-28</lastmod>
     <changefreq>monthly</changefreq>
     <priority>0.6</priority>
  </url>



</urlset>

Don’t forget to add 2 more <url> blocks (there’s a sketch of one after this list):

  1. One <url> block for contact.html
  2. One <url> block for robot-parts.html
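
For example, the contact.html block might look something like this (the lastmod, changefreq & priority values are just placeholders):

  <url>
     <loc>https://robotparts.co/contact.html</loc>
     <lastmod>2027-10-28</lastmod>
     <changefreq>yearly</changefreq>
     <priority>0.5</priority>
  </url>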

Check out Learn the Web’s sitemap.xml.

(Usually humans don’t create these by hand; they’re generated by a tool like Jekyll.)

  1. Line A

    This code language is called XML; the structure is similar to HTML, with angle brackets, etc., but it’s significantly more strict.

  2. Lines D–I

    Every page on the website should have its own <url> entry.

    The only required tag within it is <loc>: the URL of the page itself. The other tags are optional:

    • lastmod: The date the page was last modified
    • changefreq: How often the page changes
    • priority: How important the page is; the default is 0.5
  3. Line K

    Add 2 more <url> blocks here for the contact & robot-parts pages.

12 Add extra metadata

The contact page has the address and contact information for the store, so it’s a great idea to add extra Schema.org metadata to help computers (and robots) understand the content better.

⋮
    <p>© 2027 Robot Parts Co.</p>
  </footer>

  <script type="application/ld+json">
    {
      "@context": "https://schema.org",
      "@type": "Organization",
      "name": "Robot Parts Co.",
      "email": "parts@robotparts.co",
      "address": {
        "@type": "PostalAddress",
        "streetAddress": "1234 Robot Lane",
        "addressLocality": "Robot Mega City 1",
        "addressRegion": "RoboStateB",
        "addressCountry": "RoboCountry211",
        "postalCode": "11001010100101001"
      }
    }
  </script>

</body>
⋮

The next lesson, Portfolio metadata, goes into more depth with the extra Schema.org metadata.

  1. Line E

    Add a new <script> tag & set its type to application/ld+json

    If we don’t set the type attribute the browser will try to execute it as JavaScript.

  2. Line F

    The whole thing is a JSON object—so open & close some curly brackets.

  3. Line G

    The first entry is @context: it defines that we’re using the Schema.org syntax.

  4. Line H

    Next up we’ll specify that this JSON-LD object is an Organization: one of the types defined on Schema.org.

  5. Line I

    Specify the name of the corporation.

  6. Line J

    Specify the organization’s email address.

  7. Line K

    The next entry is a sub-object called address: it has its own set of Schema.org properties we need to implement.

  8. Lines M–Q

    There are a few different definitions here, each for a distinct part of a postal address:

    • streetAddress — the building number & street name.
    • addressLocality — the city or village or hamlet, etc. — e.g. “Ottawa”
    • addressRegion — province, territory, state, canton, etc. — e.g. “Ontario”
    • addressCountry — the country of the address — e.g. “Canada”.
    • postalCode — that one’s pretty obvious.

Drop it into Markbot & submit

Drop the final, coded exercise into Markbot and fix all the errors until Markbot gives you all green (and maybe a little yellow).

After you’ve fixed all the problems, go ahead and submit the assignment. You’ll immediately get your grade.

  1. Submit

    Whenever you’ve passed all Markbot’s specific tests go ahead and submit this lesson for marks.