SEO Project Manager Susan Sisler
https://dagmarmarketing.com/blog/author/susan-sisler/

Noindex Meta Tags vs. Robots.txt: Which Should You Use?
https://dagmarmarketing.com/blog/noindex-meta-tags-vs-robots-txt/
Thu, 13 Jul 2017

Even those who’ve been in the SEO business for a while can get confused about whether to use noindex meta tags or robots.txt files to control how search engines “see” web pages, and whether those pages should appear in search results.

In an earlier post, we wrote about some of the reasons to use robots.txt files on certain pages, and those reasons apply to the use of noindex tags as well. That’s about where the similarities between robots.txt and noindex tags end, though, as you’ll see.

What’s the difference?

In the very simplest terms:

  • A robots.txt file controls crawling. It instructs robots (a.k.a. spiders) that are looking for pages to crawl to “keep out” of certain places. You place this file in your website’s root directory (see the sample file after this list).
  • A noindex tag controls indexing. It tells spiders that the page should not be indexed. You place this tag in the code of the relevant web page. Here is an example of the tag:
    <meta name="robots" content="noindex,follow"/>
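
And here is a minimal robots.txt sketch; the /private/ directory is a hypothetical example of a section you’d want to keep crawlers out of:

    User-agent: *
    Disallow: /private/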

When to use robots.txt.

Not all content on your site needs to be or should be found. There are instances in which you may not want certain sections of your site to appear in search results, such as information meant only for employees, shopping carts, or thank-you pages.

Use the robots.txt file when you want control at the directory level or across your site. Keep in mind, however, that robots are not required to follow these directives. Most, such as Googlebot, will, but it is safer to keep any highly sensitive information out of publicly accessible areas of your site.
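
For example, a robots.txt file like the following would ask all compliant crawlers to stay out of the kinds of sections mentioned above (the directory names are hypothetical; substitute your own):

    User-agent: *
    Disallow: /employees/
    Disallow: /cart/
    Disallow: /thank-you/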

When to use noindex meta tags.

As with robots.txt files, noindex tags can keep a page out of search results. The difference is that the page will still be crawled, but it won’t be indexed. Use these tags when you want control at the individual page level.
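
In practice, the tag goes inside the page’s <head> section. A minimal sketch of a page kept out of the index (the title is a hypothetical example):

    <!DOCTYPE html>
    <html>
      <head>
        <meta name="robots" content="noindex,follow"/>
        <title>Thank You</title>
      </head>
      <body>...</body>
    </html>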

An aside on the difference between crawling and indexing: Crawling is how a search engine’s spiders discover and read your website; the results of that crawling go into the search engine’s index. Storing this information in an index speeds up the return of relevant search results—instead of scanning every page related to a search, the engine queries the index, a much smaller database. If there were no index, the search engine would have to examine every bit of data in existence related to the search term, and we’d all have time to make and eat a couple of sandwiches while waiting for results to display. The spiders, in turn, keep the index’s database up to date.

Let’s be careful out there!

As we warned in our post on robots.txt files, there’s always the danger that you may end up making your entire website uncrawlable, so pay close attention when using these directives.

Improve Your SEO Results With a Robots.txt File
https://dagmarmarketing.com/blog/improve-seo-with-robots-txt-file/
Thu, 06 Apr 2017

Take control of how the pages on your site are (or aren’t) accessed by search engines by creating a robots.txt file. The goal is to block certain pages or parts of your site from search engine crawlers, which may seem counterintuitive — you want search engines to find your web pages, right? — but it’s a solid SEO tactic.

Here’s a quick guide on how and when to create a robots.txt file, and why it’s so important to get it exactly right.

How the robots.txt file works.

Search engines send “spiders” to gather information from your site. That information is used to index your web pages so that users can find them in search results. These spiders are also referred to as robots.

When search engine spiders visit your site, the robots.txt file tells them which sections or pages on your site they shouldn’t crawl. Here are a few types of content you may not want spiders to crawl (a sample file follows the list):

  • Duplicate content: A printer-friendly version of a page or a copy of a manufacturer’s description of a product you sell.
  • Non-public pages: A development or staging site.
  • Pages along the user path: A page displayed to site visitors in response to actions they’ve taken, such as a thank-you page.
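
A robots.txt file covering cases like these might look something like the sketch below. The paths are hypothetical; substitute the actual directories and pages on your site. (A staging site on its own domain would normally carry its own robots.txt rather than being blocked here.)

    User-agent: *
    Disallow: /print/
    Disallow: /thank-you/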

Creating a robots.txt file.

You can find your robots.txt file in your site’s root directory. Google Search Console provides step-by-step instructions for creating a robots.txt file or editing an existing one, along with a tool for testing your file to verify that a URL has been properly blocked. Once you’re done, make a note on your calendar to review the file regularly so that, as your site changes, it remains properly configured.

Also keep a few things in mind:

  • Be very careful—there is a chance you could disallow access to your entire website with just a few minor mistakes.
  • According to Google in 2014, “Disallowing crawling of Javascript or CSS files in your site’s robots.txt directly harms how well our algorithms render and index your content and can result in suboptimal rankings.” Leave JavaScript and CSS files unblocked. On a WordPress site, these files live largely in /wp-includes/ and /wp-content/, so be sure not to disallow those folders (see the sketch after this list).
  • Important: Configuring a robots.txt file does not ensure that the blocked URLs won’t be indexed. This can still happen if search engines discover a URL another way—generally by following a link to the disallowed page. Using the noindex meta tag is the only foolproof way to prevent a page from being indexed.
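
As a rough illustration, a common WordPress pattern blocks the admin area while still allowing the one admin file that front-end features rely on. This mirrors the WordPress default, but verify it against your own setup:

    User-agent: *
    Disallow: /wp-admin/
    Allow: /wp-admin/admin-ajax.php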

Search engines will still have the ability to crawl your site without a robots.txt file, but taking the time to make one supports good SEO results and lets you more tightly control how your site is accessed and crawled.

Ten Fun and Creative Custom 404 Pages
https://dagmarmarketing.com/blog/ten-fun-custom-404-pages/
Tue, 05 Apr 2016

A few weeks ago I blogged about the purpose of HTTP error 404 and the importance of using a custom 404 page to direct your visitors and keep them on your site should they happen upon a broken page. Since then I have come across a number of sites that use custom 404 pages in very creative ways, so I began searching for more.

Please enjoy this showcase of ten sites with fun custom 404 pages!

Pet Sites’ 404 Pages

I have discovered that people who love animals and have pet-related sites can be very cute and creative with their custom 404 pages. Who can resist these adorable creatures and clever commentary?

HillsPet.com

“Well, Woof. This page doesn’t exist. Don’t go around in circles, we’ll help you find what you’re looking for.”

[Screenshot: HillsPet.com 404 page]

EmbracePetInsurance.com

“Our dogs ate this page… Naughty pups.”

[Screenshot: EmbracePetInsurance.com 404 page]

ASPCA.org

“Oops! Page Not Found. We’re going to assume you were looking for a way to help animals! Here are three ways you can help right now!”

[Screenshot: ASPCA.org 404 page]

Other Sites’ 404 Pages

These five sites certainly show they have a sense of humor. They come from a broad range of industries: travel, gardening, retail, HVAC — and even a Catholic church.

Flight001.com

“Uh oh! Looks like you’ve encountered some turbulence. This page cannot be found. Please pick another destination below…”

[Screenshot: Flight001.com 404 page]

GardeningKnowHow.com

“This is a dead page. To keep your plants from becoming like this page, please search our site for the information you were looking for.”

[Screenshot: GardeningKnowHow.com 404 page]

Kohls.com

“We couldn’t find a match. The page you’re looking for has been moved, deleted or can’t be found, just like that sock. For additional help (but not with your laundry), contact us.”

[Screenshot: Kohls.com 404 page]

FHFurr.com

“We hate to break it to you, but just like Santa Clause [sic], the page you’re looking for doesn’t exist!”

[Screenshot: FHFurr.com 404 page]

SSFrancisJohn.org

“Dear St. Anthony, Please come around. This page is lost and cannot be found.” – 404 Prayer to St. Anthony (patron saint of lost things)

[Screenshot: SSFrancisJohn.org 404 page]

I hope you’ve enjoyed this selection of fun custom 404 pages, and that you’ve been inspired to create your own original page! Share yours or others you’ve found in the comments below!
Conversions have tanked — or did your goal URLs change?
https://dagmarmarketing.com/blog/conversions-have-tanked-or-did-your-goal-urls-change/
Fri, 25 Mar 2016

We’ve seen it time and again. A nicely redesigned, perhaps reorganized, website. Maybe a new platform. New user-friendly URLs. All is well — yet conversions have dropped to zero since the relaunch. Or have they? What’s going on?

[Screenshot: goal completions graph in Google Analytics]

How destination goals work

In Google Analytics, there are five goal-type options to track your site’s conversions (also called goal completions). This article focuses on the first option, the destination goal:

[Screenshot: goal type options in Google Analytics]

A destination goal works exactly how it sounds: when users perform an action on your site, such as clicking a button to submit a form, they are sent to a “destination” page — generally a page that thanks them for their action (such as contacting you). This “thank-you page” has its own unique URL, which is entered into Google Analytics as the destination goal. This tells Analytics to record a conversion whenever someone visits the thank-you page.

[Screenshot: destination goal setup in Google Analytics]
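
Conceptually, the setup amounts to something like this (the URL is a hypothetical example):

    Goal type:    Destination
    Destination:  Equals to /thank-you/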

This is one of the simplest ways to track conversions. So, what’s the problem?

How search engines read URLs

Search engines “read” URLs character-for-character. Therefore, /thank-you.html is a completely different page from /thank-you/.

Let’s say your URLs all had the .html extension before the redesign. The extension has been removed and your new thank-you page is located at /thank-you/. You are probably still getting form submissions — but they will no longer be recorded in analytics unless you have updated your destination goal to the new URL.

Note that if proper 301 redirects have been put in place from all the old .html URLs to the new URLs, this may not be a problem. In our experience, however, thank-you pages can tend to be overlooked in the redirect process.
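
If you need to add a missing redirect yourself, here is a minimal sketch for an Apache server using mod_alias (the paths are examples; adjust for your own site and server):

    Redirect 301 /thank-you.html /thank-you/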

Equals To vs. Begins With

Be careful when choosing between these options for your destination URL. “Equals to” means a conversion will be tracked only if the visited page exactly matches the URL you have entered. “Begins with” will capture conversions on all pages whose URLs start with that path but may be followed by other characters or query strings, such as /thank-you/?eid=2219. Be sure to verify that goals are tracking correctly in Analytics.

[Screenshot: destination match type options in Google Analytics]
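
As a quick illustration of the difference, using hypothetical URLs:

    Destination: Begins with /thank-you/
      /thank-you/             → conversion recorded
      /thank-you/?eid=2219    → conversion recorded

    Destination: Equals to /thank-you/
      /thank-you/             → conversion recorded
      /thank-you/?eid=2219    → no conversion recorded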

Update those goals!

Goal tracking is a manual process, and if anything changes on the site, those changes need to be updated in your Google Analytics goals. This step is often forgotten in the redesign process — and in fact, errors can happen at any time on your site, not just during a redesign, as developers make necessary changes. Remember to check your goals often and update them as needed.

Helpful hint: We’ve also written about what to do if you’ve lost traffic after a redesign.
Http Error 404: It’s Not Always Bad!
https://dagmarmarketing.com/blog/http-error-404-its-not-always-bad/
Wed, 02 Mar 2016

As an SEO, I really dislike seeing HTTP status 404 errors on a site, also known as 404 “page not found” errors. You know, those empty pages that don’t contain what you expected to find. Happening upon page-not-found errors can be frustrating, especially if they are not handled properly on the site—more on that subject later—but they are there for good reason.

Why use HTTP status 404?

Believe it or not, the 404 status code serves an important purpose: to tell Google and visitors that the page no longer exists and, equally important, that it has not moved to another location on the site. The page is simply no longer available. This is a perfectly acceptable, and correct, use of the 404 error code.
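
At the protocol level, a 404 is simply the status line the server sends back with the response. A stripped-down sketch:

    HTTP/1.1 404 Not Found
    Content-Type: text/html

    <html>...your 404 page markup...</html>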

Google expects to see some 404 errors on a normally functioning site; however, a large and increasing number of 404s can be a red flag, and your site may receive a warning from Google in the Search Console. Such a message looks like this:

[Screenshot: Google Search Console warning about an increase in “page not found” errors]

Note the last line: “If these URLs don’t exist at all, no action is necessary.” Google spells it out for us right there. Despite this, webmasters often feel compelled to “fix” these 404 errors using methods that are not recommended, such as:

  • Marking 404 errors as “fixed” in the Search Console simply to remove them from the crawl errors list. Rest assured, Google will find them and add them back if they have not actually been resolved. If a 404 page has been properly redirected, however, it may then be marked as fixed.

[Screenshot: the Mark as Fixed option in Google Search Console]

  • Redirecting removed pages to the home page. A redirect tells Google the new URL where the content now resides, and that new page should be a close match for the original, with similar or identical content. Doing otherwise confuses visitors, who unexpectedly end up on the home page rather than the page they intended to reach. This practice also causes soft 404 errors.

What are soft 404 errors?

Soft 404 errors are pages that display unexpected content, either by redirecting to an incorrect page, as in the home page example above, or by showing page-not-found content on a page that doesn’t return an actual HTTP 404 status. Soft 404 errors are listed in the crawl errors section of the Search Console, and any significant increases will be reported by Google:

[Screenshot: Google Search Console warning about an increase in soft 404 errors]

Excessive soft 404 errors also waste crawling resources on non-existent pages and can create a poor user experience. These URLs should be changed to a 404 status code if they cannot be redirected to new, equivalent pages.
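
In other words, the tell-tale sign of a soft 404 is a “not found” message served with a success code. A sketch of the difference:

    Soft 404 (the wrong signal):
    HTTP/1.1 200 OK
    ...body says the page doesn’t exist...

    True 404 (the right signal):
    HTTP/1.1 404 Not Found
    ...body says the page doesn’t exist...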

The importance of a custom 404 page

While 404 page not found errors are inevitable, especially on larger sites, creating a custom 404 page will help keep visitors on your site. The page should contain the same site layout and menus and provide helpful verbiage and links to direct the visitor, as in this example:

[Screenshot: a well-designed custom 404 page with site layout, menus, and helpful links]

By contrast, here’s a bad example of a 404 page not found that could cause your site to lose visitors:

[Screenshot: a generic server error page with no site branding or links]
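
How the custom page gets served depends on your server. On Apache, for example, a single directive in the .htaccess file does the job (the file path here is a hypothetical example):

    ErrorDocument 404 /custom-404.html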

Best practices for 404 errors

Follow these tips for handling crawl errors on your site:

  • Check both 404 errors and soft 404 errors periodically in the Google Search Console to see if any can be redirected to new pages
  • If a page no longer exists and cannot be redirected to a logical new page, then the 404 status should be implemented
  • Never redirect removed pages to the home page
  • Always employ a custom 404 page with useful links to guide visitors

Contact the SEO professionals at DAGMAR Marketing to ensure your site is being crawled completely and efficiently.