Another Google update has been announced by Matt Cutts, head of Google’s web spam team and the public face for all Google algorithm information. This one has been lovingly named “Google 2.0” courtesy of the internal Google team. As always, we don’t know how much the changes will affect your rankings, and as always, the effects are determined by your SEO practices.
Below is a short YouTube video by Matt Cutts on Google 2.0. Keep reading for what this all means for SEO, both for recovering from bad past work and for moving forward with white-hat, ethical SEO.
For the good of every business, I hope that the SEO company you employ is bringing you forward-thinking SEM tactics, so that you are not worried at all after watching that video. If you are unsure, please contact us and have us evaluate your current tactics and link profile.
With every change there is momentum towards great loss for some, and great opportunity for others.
Too often I see people writing about what needs to change in order to abide by the new “algorithm law.” If great SEO/SEM companies focused on what really matters, a flawless user experience for your industry, we might never need to worry about algorithms at all.
To focus on the good before the bad: this update is a great opportunity for websites that are an authority in their industry to jump in the rankings. With the highest of hopes, we would love to hear that you have already moved forward with white-hat link-building practices, so that nothing holds your authoritative site back from jumping.
Matt Cutts said, “We’re doing a better job of detecting when someone is more of an authority on a specific space… And try to make sure that those rank a little more highly if you’re some sort of authority or a site, according to the algorithms, we think might be a little more appropriate for users.”
Hopefully, that is great news for your business!
This update is also great news if you have been engaged in good link-building practices. All the websites buying into link farms or the “too good to be true” promotions of 10,000 unique links built every month should be hit hard by this update, though no one knows yet how substantial the effects will be.
I like that Google has been updating and advancing tools within Google Webmasters.
That being said, more functionality will be added to Google Webmasters.
Google is going higher up in the chain and beginning to deny value to link-spammers and companies engaging in these practices.
Google Penguin has always been about link-building and preventing webspam.
If you dominate a particular keyword such that multiple listings for your site show up on a page (referred to as “clusters of listings”), your website will appear less and less as a user pages further back in the results. Clusters of listings on the first page are highly desirable: the more positions you hold, both organic and AdWords, the more clicks you are likely to receive.
But as a user digs deeper into the listings, it becomes rarer that they will see your site.
This change, I assume, is meant to create a better user experience. A user would only dig deeper into listings on page 2 and beyond if the dominant site (the cluster of listings) was not helpful for their query, so showing more listings from that same website deeper in the results would make the experience less useful.
We are excited to see the results of this algorithm. One beacon of hope for many websites: Matt Cutts says that if you are engaged in high-quality link-building practices, this algorithm is not an issue.
If you are concerned with your current SEO company or practices, please contact us and we’d love to help evaluate if you are engaged in white-hat ethical SEO practices with a forward thinking mindset.
Please join our Facebook page to keep up to date with Google. Join Here
What Are 301 Redirects and Why Do They Matter
301 redirects are the most important action you can take when moving content from one URL to another while maintaining PageRank and search rankings. Even when done properly, the value transfers over slowly, meaning the old URL’s value will not be apparent at the new URL immediately after the redirect. Through this slow transfer, bots are able to determine whether the new page correlates in context with the old URL. If the context of the two URLs is not the same, the value should not be switched over, since they are not, in fact, the same page.
Proper use of 301 redirects improves the user experience by not showing a 404 error when someone attempts to land on a page that is no longer available. A 404 error page essentially tells the user they’ve found a dead end and need to choose a new path. It is also important for your overall site health to fix any 404 errors. This can be done either by disallowing the broken URL in your robots.txt file, or by implementing a 301 redirect to the correct, functioning URL the user intended to land on.
Beyond improving the user experience and overall site health, a 301 tells search engines that any link juice given to the old URL should now flow to the new URL. In essence, you don’t have to do the work of building up PageRank for the new URL from scratch, since much of the old URL’s value is transferred over.
There is no limit to the number of 301s a website can have; however, Google does limit the number of “hops” it will follow down a redirect chain. For instance, say you want page A to redirect to page G. It’s best to redirect A directly to G, and Google’s bots will even follow you if you redirect A to B to G. However, Google’s bots will not follow A to B to C to D to E to F before finally reaching G. They understand this is excessive and will simply end the silly goose chase. Excessive redirect chains also give warning that something fishy or black-hat is occurring, since there is rarely a legitimate need to redirect a URL two or three times before reaching the intended destination.
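As a sketch with hypothetical paths, collapsing a chain in .htaccess means pointing every old URL straight at the final destination rather than at the next hop:

```apache
# Bad: a chain of hops that Google's bots may stop following
Redirect 301 /page-a/ http://www.example.com/page-b/
Redirect 301 /page-b/ http://www.example.com/page-g/

# Good: every old URL points directly at the final destination
Redirect 301 /page-a/ http://www.example.com/page-g/
Redirect 301 /page-b/ http://www.example.com/page-g/
```

Keeping the old intermediate rules pointed at the final URL also means visitors who bookmarked any page in the chain still land in the right place with a single hop.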
How To Do 301 Redirects in WordPress
Since different development platforms handle 301 redirects differently, we are going to focus on just one: WordPress. Essentially, there are two methods for doing redirects in WordPress: the Redirection WordPress plugin, or manual edits to the .htaccess file.
WordPress Redirection Plugin: This plugin, found on wordpress.org, manages 301 redirections and tracks any 404 errors that may be present on your site. It is especially useful when migrating pages from an older site or when changing the directory location of your WordPress installation files.
If you view the plugin’s feature list, you will soon appreciate its versatility and how it makes 301 redirects quick and easy. It also gathers information that would otherwise have to be collected manually, such as how many times a specific URL has been redirected, the dates and times of those redirects, who issued them, and where the URL is being found that generates the traffic to your site. Overall, it’s a rather useful and crafty tool; if you have a WordPress website, it would behoove you to install it. The one word of warning: if WordPress updates and the developer has not updated the plugin to match, the plugin may very well fail you. In that case you would have to do the redirects manually through the .htaccess file, which is what we will get into next.
.htaccess File For Static Redirects: Some of you would rather do it manually, perhaps for the sake of thoroughness, perhaps so as not to rely on plugins. Either way, writing redirects in by hand is definitely a bit more difficult and more time-consuming. With that said, let’s jump right in by first discussing a few important aspects of using .htaccess to do redirects.
Once you’re ready, it’s time to do the redirects themselves. Open up the .htaccess file and, at the end of the file (or next to any existing redirects), write a line exactly as shown in the example below:
Redirect 301 /(error URL)/ http://www.url.com/(redirected URL)/
Notice the space between the old path and the destination URL. Without that space, you would essentially be telling the server that the URL you want redirected is one long combined path, and never stating where it should be redirected to. Leave out the space and you will almost certainly receive a 500 error.
You then continue through all of the errors that are coming up in this manner, going through them one by one, and checking to make sure each has been redirected properly. It’s simple enough, but can become a large hassle the more errors there are to deal with.
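As a sketch with hypothetical URLs, a handful of manual redirects in .htaccess might look like this, one line per broken URL:

```apache
# Each line maps one old or broken path to its working destination
Redirect 301 /old-services/ http://www.example.com/services/
Redirect 301 /2010/05/old-post/ http://www.example.com/blog/new-post/
Redirect 301 /contact.html http://www.example.com/contact/
```

After adding each line, load the old URL in a browser to confirm it lands on the intended page rather than a 404 or a 500 error.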
If you need help with 301 redirects, improving your site health, or any other SEO concern that may arise, and are looking for an SEO consulting firm to assist you, be sure to contact CustomerParadigm.com right now! We’ll be able to help you whether you’re an old and/or established website looking for improvements, or a fresh and brand new one looking to start things off on the right foot. We have the experience and knowledge necessary to deal with any and all SEO problems, helping you to rank better for specific keywords, receive more traffic, increase brand awareness, and allow for better opportunities to increase conversions.
It’s nice to look back twelve months and take a look at the numbers.
For one of our SEO clients, here’s a quick snapshot of the increase in traffic:
Here is an overview of the increase in traffic year over year, for January 2012 – January 2013:
For years, the kind of cutting-edge industrial emission control technology which AeriNOx specializes in was only produced and installed by companies based in Europe, especially Germany. Europe has had more stringent industrial emissions requirements than the U.S. for many years. However, greater awareness of the impact of industrial emissions in the U.S. has led to more stringent emissions requirements for U.S. industries, and necessitated the kind of equipment and know-how that AeriNOx possesses. For more information visit AeriNOx-Inc.com.
2011 marked a year of drastic change for Internet Marketing, and 2012 promises to be no different.
From Google’s Panda update last year that changed the rules on Search Engine Optimization, to increased competition for CPC marketing and the launch of Google+ in social media, the pace of change is accelerating.
This next series, Top Internet Marketing Trends for 2012, will explore in detail what these changes mean for businesses and organizations, and what you can do to make sure you stay ahead of the curve.
Today’s tip talks about how Google is using approximately 12,000 humans to evaluate the quality and experience of different websites, and what you need to know to survive the evaluation process.
I hope you enjoy the series!
P.S. Here is a picture of an Ibex I took on a recent trip to Israel.
Photo of an Ibex taken in Israel.
Internet Marketing Trend #1 for 2012: Google’s 12,000 Human Website Evaluators
Recently leaked documents confirm that Google employs a huge number of people who do nothing but visit websites and evaluate them to help improve Google’s search ranking algorithm. While definitive numbers aren’t available, industry estimates suggest that Google (through several subcontracting companies) employs between 12,000 and 15,000 people, who work from home for between $10 and $12 per hour. How many sites are being reviewed?
It’s difficult to know. But if one human reviewer visits 2 websites per minute (one site every 30 seconds), and works 2 hours per day, five days a week for 50 weeks of the year, they would be able to review 60,000 websites over the course of the year. Scaled to 12,000 people, that means that humans could sift through 720 million websites per year. The cost of paying 12,000 people to work 2 hours a day, 5 days a week for a year? About $60 million. Which sounds like a lot of money, but Google’s revenue exceeds $30 billion.
So why is Google paying so much money to review websites? Google’s mission from the beginning has been to provide their users with the most useful, most relevant information possible. The human website reviewers offer a way to test the Google search algorithm, and make sure that what appears high in Google search results are relevant sites that (a) are not spammy, and (b) are useful to end consumers.
According to SEOmoz, these humans are “Google’s fact checkers – the people who work to make sure the algorithm is doing what it’s supposed to do. Data from [human] quality raters not only serves as quality control on existing [results], but it helps validate potential algorithm changes. When you consider that Google tested over 13,000 algorithm changes last year, it’s a pretty important job.”
How Are Sites Ranked?
Sites are ranked according to how useful they are to the end user. If your site has unique, well-written content that educates people, it will do well. If your site is mostly made up of vague marketing messages (“We have solutions”), or serves just to promote your product, it won’t do as well.
How Can You Survive the Review Process?
Google’s fact checkers like to see sites that:
If your site follows these guidelines, you should be fine.
12. Measure & Track Your SEO Efforts
How do you know if your search engine optimization is working?
The quick answer is this: You need to measure and track how people come to your site. If someone makes a purchase from the site, or fills out a contact form, you should be keeping track of the search engine queries that they used to get there. This is information that your website is probably already collecting, but you’re likely not using on an individual basis for each person. When someone comes through our website and fills out a contact form, we are able to track exactly what search terms they used.
We can then do roll-up reporting on our search engine optimization efforts, to know which new leads came in from SEO. If you’re doing Google AdWords or other CPC advertising, it’s easy to measure and track conversions; Google’s tracking system makes it simple, since you pay each time someone clicks on one of your ads. With a natural search engine program, however, it can be a little more difficult.
What we like to do is track 10-15 top keywords, and see how they change in the search results each month.
Then we track and measure what’s working (new pages, added content, new inbound links), and try to enhance the results even more. The issue is that SEO is a zero sum game. If you’re not at the top of the rankings, but one of your competitors is instead, you’re going to lose out. SEO is a constantly shifting game. What worked last year or last month won’t necessarily work next week, as other sites add content and better optimize their sites.
11. Avoid Industry Jargon.
One of our clients, NewStripe, makes machines that paint the lines on football and baseball fields. Within the industry, the machines are known as wet line markers (or dry line markers). But customers don’t often use these terms. Instead, a typical customer might search for “machines to put stripes down on athletic field” (Newstripe.com is the #2 search result), or “painting stripes on your athletic field” (Newstripe.com is the #1 search result).
Does your site copy and content reflect the language a potential customer will use in a search?
If not, a prospective customer will either (a) have to learn the industry lingo in order to find you, or (b) visit your competitor’s site. Option B is a lot more likely.
So how can you tell if your site is using too much industry jargon?
1. First, ask your current customers to take a look at your marketing materials and website. It’s a great way to engage satisfied customers without trying to sell them anything. Most people are flattered when you ask them for their opinion.
2. Second, ask someone who knows very little about your industry to read through your site, and see if they can figure out what your company does for a living. If they are confused, then it’s likely your potential customers will be confused as well.
3. Third, pay attention to how the press covers your industry. Reporters try to communicate broad ideas, and try to cut through esoteric terminology.
10. Customize Your Website with Visitor’s Search Keywords
Okay… your site is now optimized for search engines and you have people coming to it. Now what? You want to make the experience as welcoming and easy as possible. Otherwise, a visitor won’t be able to find what they’re looking for, will get frustrated, and will leave (never to return). A common problem is that while websites may be optimized for search engines, they aren’t always optimized for their human visitors. This week’s strategy takes the keywords a person used to find your site, and then shows them content based on that request. So, for example, if you search for “personalized URL” in Google, we show up #2 in the results:
If you then click on this link, the flash animation at the top of the site will read, “Searching for personalized url? Click Here or Call 303.499.9318.”
We do this by taking the search string from the person’s browser and dynamically passing it to our system. We can also dynamically insert content on the page based on these keywords. Does it work? We’ve measured a 30% increase in people filling out a contact form or calling us since we implemented this on the site a few weeks ago.
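Our exact implementation is server-side, but as a minimal sketch (assuming the search engine exposes the visitor’s query in a `q` parameter of the referring URL, as Google did for standard searches at the time), extracting the keywords might look like this:

```python
from typing import Optional
from urllib.parse import urlparse, parse_qs

def search_keywords_from_referrer(referrer: str) -> Optional[str]:
    """Pull the visitor's search query out of the HTTP Referer URL.

    Assumes the engine passes the query as a `q` parameter; returns
    None when the referrer carries no recognizable search query.
    """
    params = parse_qs(urlparse(referrer).query)
    terms = params.get("q")
    return terms[0] if terms else None

# A visitor arriving from a Google search for "personalized URL":
ref = "http://www.google.com/search?q=personalized+url&hl=en"
print(search_keywords_from_referrer(ref))  # personalized url
```

Once the keywords are extracted, the page template can swap in a headline or call-to-action that echoes the visitor’s own search phrase.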
9. Reverse Archaeology
Archeology, of course, is the systematic uncovering of artifacts from the past that have been buried or forgotten.
One of our more famous clients is an archeologist who has been labeled a “real life Indiana Jones.” He’s the author of a new book, Ten Discoveries that Rewrote History. What he and other archeologists do is examine artifacts that were lost and buried, and draw conclusions about how life was lived hundreds or thousands of years ago. When people search online, it’s a lot like sifting through thousands of years of junk and broken pieces of pottery in order to find the one intact tablet that solves your mystery. If you’re like me, you often see a lot of non-relevant results returned when you do a web search.
Google, as good as it is, isn’t perfect. You scan down the page, looking for the answer to your question, and then you suddenly see a link to a site that matches exactly what you’re looking for. Reverse archeology, applied to the Web, is a process that allows you to plant key information for people to find and discover. The article in the New York Times last weekend was an example: a reporter was looking for fun, cool, hip and trendy information about Boulder. I’m not very hip, but my wife is. So when the reporter did a search in Google for something like “spiritual skiing,” her site came up first in the list. I tagged along for the hike and interview, squeezed my way into the article, and gained the new label “Web site Guru” from the New York Times.
How does the process of reverse archeology work? Essentially, you come up with keywords that your target audience is likely to use in a search, and then you create relevant content on your site around them. When prospective customers or the press “dig” through the mass of web pages in a Web search, they can find your site quickly and easily. Reverse archeology is a different type of mindset for generating website content, but one that can be extremely successful.
8. Age & Experience Matter.
Google’s continuing mission is to deliver the best search results possible to its end users. So how can a search engine like Google differentiate between a company that is brand new (and might be a fly-by-night operation) versus an organization that has years of experience in the field?
The answer: Google looks at the age of your domain name (along with several other variables). If your domain name was registered last week, chances are good that your site won’t even appear in Google’s rankings for several months. (This is called the Google Sandbox.) But if your domain name was registered eight years ago, Google takes this as a clue that you’ve been around for a while.
In a nutshell, Google looks at the month and year when your domain was registered — and uses this to give more weight to companies that have been around for several years versus several weeks. Google also looks forward to see how long you have registered your domain; if you have registered your domain for five or ten years in the future, you’ve made a subtle, yet important economic decision that you’re still going to be around and in business in 2018. Here’s an instance where being frugal with domain name registration can actually hurt your rankings.
That said, Google does place more weight on the past than on the future. I have one domain that’s been continuously registered for 14 years (since 1994). Along with many other factors, this site, www.rmiug.org, has a very high Google PageRank of 6/10. So, what can you do to increase your rankings?
First, make sure that your company’s domain name is registered for at least a few years from the present date.
Second, make sure you know who the contact person for your domain name is. We’ve recently seen several instances where the person in charge of the domain name moves to a different organization, goes on vacation, or even passes away. If the domain name then comes up for renewal, it can be a mad scramble to keep your website and corporate email up and running.
7. Naming Your Images for SEO Success
Even though search engines can’t read words inside graphics, they do use the name of the file and other contextual information to rank you. One of the biggest missed opportunities is not naming images with search engine optimization in mind. I can’t tell you how many times I see a site with the logo named logo.jpg. While that’s sufficient to display the logo in a browser, it’s much better to name the logo with descriptive keywords, such as customer-paradigm-logo.jpg.
Another way to look at this is to consider an image name out of context: pass-med-425.jpg. That name doesn’t tell you much about what is in the image. This one, however, gives a search engine quite a bit more information: passover-in-moab-utah-2008.jpg. If you search for “passover in moab” in Google, see what comes up first.
If you want to further increase relevancy, you can place the image in a folder whose name also contains keywords, which helps increase keyword density on a page. For example, an image at /search-engine-optimization-services/seo-services-header-logo.jpg carries more relevancy than the same image in a non-descriptive folder. Yes, it takes a little more time and effort to type out longer image names and keep them organized into different folders on your website. But our research has found that increasing the relevant keywords in your images is a surefire way to increase your search engine rankings.
6. Keyword Density
What is keyword density? It’s a percentage which is calculated this way:
Number of times keyword appears on a page / Total word count on page = Keyword Density.
Keyword density is usually displayed as a percentage. So, if you have a page that has 100 words on it, and you have a keyword appear 5 times on the page, your page would have a keyword density of 5%. (5 / 100 = 5%) In a real life example, the search term “personalized URL” has an overall keyword density on this page of 3.05%:
Click the graphic above to see a live sample of a keyword page. (15 instances of the keywords / 981 total words on the site = 3.05%) However, not all keywords on a page are treated the same. Keywords in the title tag, page name, and section headings are often given higher weight than keywords that appear in the regular content area of the page. Here’s how the keywords break down in the different areas of the site (the percentages count both words of the two-word phrase against each area’s total word count):

Description     Keywords   Total   Percentage
Title Tag       1          8       25.00%
Page Name       1          3       66.00%
Linked Text     1          61      3.27%
URLs in Links   2          237     1.68%
Visible Text    7          627     2.23%
Total           15         981     3.05%

This keyword term currently has a ranking of #2 in Google:
Click the graphic above to see the rankings. So, how much keyword density is too much? It depends on which study you read, but it’s generally best to keep your keyword density between 3-6%. Anything more, and you’ll be penalized for trying to spam the search engines. As a general rule of thumb, if the copy of the site makes sense to a human reading it, you should be fine. But if you repeat the same keyword five times in a row (Personalized URL, Personalized URL, etc), then you can be banned from search engines or penalized.
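As a minimal sketch (a hypothetical helper, not the tool used for the numbers above), computing simple keyword density for a possibly multi-word phrase might look like this:

```python
def keyword_density(keyword: str, text: str) -> float:
    """Percentage of the page's words accounted for by the keyword phrase.

    Counts each occurrence of the phrase with a sliding window over the
    words, then divides the words the phrase uses by the total word count.
    """
    words = text.lower().split()
    total = len(words)
    kw = keyword.lower().split()
    n = len(kw)
    hits = sum(1 for i in range(total - n + 1) if words[i:i + n] == kw)
    return 100.0 * hits * n / total if total else 0.0

# The article's worked example: a 100-word page where a one-word
# keyword appears 5 times has a density of 5%.
page = " ".join(["keyword" if i % 20 == 0 else f"word{i}" for i in range(100)])
print(round(keyword_density("keyword", page), 2))  # 5.0
```

Running a helper like this over each page is a quick sanity check that your copy stays inside the 3-6% range discussed above.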
5. Why Sitemaps are Baby Food For Search Engines
Just a few years ago, the philosophy about sitemaps went something like this: if your customers need to use a sitemap to find their way around your website, you haven’t done your job organizing your content and creating a navigational system that is easy to understand.
But sitemaps are now back in favor. Why? It’s less about human visitors and more about search engines. What is a sitemap? A sitemap is a page that lists all of the other pages on your site, usually in a bulleted list. Here’s an example of a sitemap:
Click the graphic above to see a live sample of a sitemap page. As I’ve discussed before, search engines are easily confused. Many pages of a website are often ‘hidden’ behind tricky menus or drop-down lists. Or, the links to reach a specific page are too deep (i.e. more than a couple of pages down from the home page). A sitemap, linked from the home page of the site, will list every page of your site in one convenient place. When a search engine visits your site map, it’s very easy for them to then get a list of every page on your site, and then crawl, digest and include all of your content in their system. We generally recommend having the link to your sitemap on the bottom footer navigation of your site.
Click the graphic above to see a live sample sitemap link. But you need to make sure that as your site changes, your sitemap is updated. Otherwise, Google and others may not index the latest pages placed on your site. Our PageDirector system, for example, automatically updates the sitemap each time a page is added, or the name of a page is changed. And even better than an HTML sitemap is an XML sitemap. An XML sitemap is a sitemap that is specifically formatted for search engines like Google. It’s a machine-readable version that allows you to specify all of the pages of the site.
Click the graphic above to see a live sample of an XML sitemap. Adding an XML sitemap ensures that a site gets indexed much more quickly than it would otherwise. For the new site, GodintheWilderness.com (a site devoted to my wife’s book, which will be published by Random House’s Doubleday Religion on April 8, 2008), the XML sitemap allowed the site to be indexed in 3-4 days vs. the usual 3-4 months. (But then again, it’s my wife… so of course she gets all of the top-shelf website development stuff.)
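As a minimal sketch (with a hypothetical domain), an XML sitemap in the sitemaps.org format looks like this:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One <url> entry per page; lastmod tells crawlers when it changed -->
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2008-04-08</lastmod>
    <changefreq>weekly</changefreq>
    <priority>1.0</priority>
  </url>
  <url>
    <loc>http://www.example.com/about/</loc>
    <lastmod>2008-03-15</lastmod>
  </url>
</urlset>
```

The file typically lives at the root of the site (e.g. /sitemap.xml), and you can point search engines to it directly through their webmaster tools.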
4. What does a search engine look for?
At the end of the day, a search engine is in business to help you find the most relevant results possible when you conduct a search. Search engines make their money by selling relevant advertising to supplement the natural, organic search results. Because a top ranking in Google or another search engine can translate into a great deal of business, it’s important to know how search engines determine who gets placed at the top of the list.
One big way search engines rank you is Relevant Content. Search engines are really good at reading text: the more relevant copy you have on your site, the better chance you have of getting your pages indexed. Search engines love pages that have more than 500 words of text on them. Why? A page with a lot of content is usually more beneficial to the end user. (Though for every rule like this one, there are many exceptions.) Adding articles, press releases, and detailed information about your products and services can all quickly increase the amount of relevant content on your site.
Another way search engines rank you is Inbound Links. The more sites that link to you, the more important your site becomes to search engines, and links from very relevant or important sites are worth more. Domains ending in .gov or .edu often perform better than .com domains for inbound links. It’s like a high school popularity contest: if the most popular kids all point to you and say that your website is better than anyone else’s, your standing in the eyes of the community is elevated. There are many other things that affect search engine ranking. I can’t go into great detail on the entire list, but even small changes can translate into higher rankings.
Some other factors that affect your search engine ranking:
Title Tags: See last week’s email.
Page Names: Keywords in page names increase the relevance of the search and are displayed in a Google search result.
Image Names: Putting relevant keywords into image names helps your ranking.
Alt Text for Images: If you hover over an image, this is the text that appears; also used by the blind to understand what an image represents.
Keyword Density: How often specific keywords appear on a page as a percentage of all of the words on the page.
Section Headings: In the HTML code, section headings like H1 or H2 are treated as more important content than the information on the rest of the page.
Words contained in links: A link like “Customer Paradigm offers Web Marketing and Search Engine Optimization Services” can help boost rankings.
Clean HTML code: Search engines are easily confused if your website’s code is a mess.
How often pages are updated: Search engines like new content, but also have a bias toward pages that have been up on the web for a long time.
Site Map: If you have a site map (and an XML site map as well), it’s easier for search engines to crawl through all of the pages of your site.
Keywords in your domain name.
The age of your domain name: Older domain names are perceived as more relevant than something registered last week.
Keywords in subdomains (i.e. http://email.customerparadigm.com)
Keywords in file directory structures (i.e. http://www.customerparadigm.com/email)
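Several of the factors in the list above live in ordinary HTML. As a minimal sketch (with a hypothetical company and page), here is where each one sits:

```html
<html>
<head>
  <!-- Title tag: shown as the blue link in search results -->
  <title>Example Widgets - Industrial Widget Design and Repair</title>
</head>
<body>
  <!-- Section heading: weighted more heavily than body copy -->
  <h1>Industrial Widget Design</h1>
  <!-- Descriptive image name in a keyword folder, plus alt text -->
  <img src="/widget-design-services/industrial-widget-logo.jpg"
       alt="Example Widgets industrial widget design logo">
  <!-- Keywords in the linked text itself -->
  <a href="/widget-repair/">Industrial widget repair services</a>
</body>
</html>
```

None of these elements requires special tooling; they are all plain markup that any search engine can read directly.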
3. Title Tags & Why They Matter
When you search in Google, the search results on the next page each start with a blue underlined link. Here’s an example:
What displays in this blue link is usually what is contained in the title tag of a web page. The keywords you placed in the search box are usually boldfaced in the search results. So, just what is a title tag, and why does it matter for search engine positioning?
According to the World Wide Web Consortium, the Title tag was designed to help people “identify the contents of a document.” When people view individual web pages out of context (often via search), context-rich page titles give the visitor a summary of the page. Instead of a title like “Introduction”, which doesn’t provide much contextual background, web designers should supply a title such as “Introduction to Medieval Bee-Keeping” instead. Google and other search engines use these rich contextual clues to hone their search results. On a web page, the title tag is part of the HTML code.
Here’s what the code looks like on Customer Paradigm’s site: <title>Customer Paradigm: Website Design, Development, Email Marketing, Content Management, PHP programming</title> Most end users won’t see the title tag. But if you remember back to my email tip about subject lines, the title tag is to a web page what a subject line is to an email campaign: it entices the end user to pay attention and open the page to read more.
Top Five Most Common Mistakes for Title Tags:
1. Untitled: When many popular web design programs create a new HTML page, they put ‘Untitled’ into the title tag. It’s up to the web designer to change this… and since most users don’t see it, designers sometimes forget.
2. No Title Tag: Like the “Untitled” tag, another key mistake is simply leaving out the title tag. If you do a view source (Internet Explorer: View —> Source), and the title tag appears like: <title></title> … then you don’t have a title tag.
3. “About” Tag: Another common mistake is a title tag that names only a section of your website. A title tag that reads “About” doesn’t tell me much about what the company or website is “About.” Instead, have it read: <title>Customer Paradigm – About the Company: Website Development & Marketing, Email Deployment, and PHP programming</title>. This gets more keywords into the title tag, and if you’re searching for a company, you instantly know what they do.
4. No Company Name In Title Tag: We recommend putting your company name at the beginning of the title tag, so that people can quickly see your company’s name when they search.
5. Same Title Tag on Multiple Pages: You should have a unique title tag for each page of the site. Why? Because each page is unique, its title tag should describe its unique content.
2. Don’t Confuse The Search Engines With Graphics
Search engines are really good at reading text. But they’re very easily confused. And if Google gets confused when it crawls through your site, you won’t rank very high in search results. Search engines, for example, can’t read words that are contained in graphics or flash animation. So if your company’s name is only contained in a graphic on your site, this content is ‘invisible’ to a search engine. Same thing goes for product or service names.
The root of the problem lies with graphic designers. Don’t take this to mean I don’t like graphic designers (I employ a bunch of them), but most websites are designed by people who are really good at building graphics and less interested in Search Engine Optimization (SEO). It takes a bit more time to place content in text and use a stylesheet to format it so that a search engine can read it, especially when it’s so easy to create a good-looking graphic in Photoshop. Here’s an example of a graphic:
Here’s how it could be formatted, instead, using text and a cascading stylesheet (CSS): See the title of this section (above) for a live example.
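Here’s a minimal sketch of that text-plus-stylesheet approach (the class name and style values are illustrative, not taken from any particular site):

```html
<!-- Instead of an image of the heading, use real text styled with CSS.
     The text stays fully readable to search engines. -->
<h2 class="section-header">Email Marketing</h2>

<style>
  /* CSS handles the visual presentation the graphic used to provide */
  .section-header {
    font-family: Georgia, serif;
    color: #336699;
    font-size: 28px;
    letter-spacing: 2px;
    text-transform: uppercase;
  }
</style>
```

The visitor sees a styled headline; Google sees the words “Email Marketing” as plain, indexable text.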
Here’s an example of a site that is built entirely in Flash: while it may look pretty to humans, its content is completely invisible to Google and the other search engines.
Even if you’re not worried about organic search positioning, but are doing paid search engine marketing (like Google Adwords), it’s important that the content on your site is easily digested by a search engine. Why? Google Adwords ranks the pages on your website and compares them to your keywords and ad copy. The more relevant Google finds the text on your site, the less you’ll have to pay for a sponsored ad on Google (and the higher your position will be).
In Summary: Don’t confuse search engines by keeping your content ‘locked up’ in graphics. It’s a small detail in the web design process, but one that will pay dividends for a long, long time in increased search results.
1. Local Search Engine Advertising
For businesses that serve specific geographic regions (i.e. Denver, Colorado), you can create search engine ads in Google and Yahoo that only appear to people in your area. How does this work? A search engine like Google uses a computer’s IP address and other information to discover where someone is searching (including city and state).
Why does Google care where a person is located?
Google’s mission is to give their end users the best search results possible. So, if I need someone to walk my dog in Ft. Lauderdale, Florida, it does me little good to receive a paid search result from Arizona. This is a real example: my brother has a petsitting business, and I’ve used local Google search engine advertising to drive new clients to his organization. Thus Google (and the other search engines) try to match search results to the geographic location of the person searching.
How does Google make money?
Google gives businesses and organizations the ability to display paid advertisements (sponsored results) on search results pages. These ads are triggered by keywords you choose (more on this in a different strategy). You don’t have to pay for your ad to display; you pay Google only when someone clicks on your ad. The technical term is Cost Per Click (CPC) advertising. The more relevant your ad (more on this later), the less you have to pay for specific keywords, and the higher up you will appear in the sponsored advertising results.

Local Search CPC Ads: In Google Adwords, you can create an advertising campaign that will target someone in a specific city or state. You can even specify a 5, 10 or 25 mile radius from a specific location (like your retail showroom or office). Below your local ad, Google will place the name of your local area (i.e. Denver, Colorado)… making it more likely that someone searching in your area will choose your organization vs. an out-of-town competitor.
Local CPC Ads are usually a more cost effective option than a national search engine advertising campaign. As a general rule of thumb, the more geographically targeted and specific you can be, the less money you’ll need to pay to acquire new customers. And make sure you have conversion tracking code placed on your site, so you can measure and track how much you’re paying for each new customer via local search engine advertising.
If you are interested in Google’s update to Google Places, read this article: Google Local.
If you need any help with your online CPC campaign, Customer Paradigm’s SEO team will efficiently organize and manage your CPC campaigns.
SEO Calculator – Conversion to Purchase (eCommerce)
|If you were ranked…||% Clickthrough||# of Visitors||Sales / Month||Lifetime Value|
|#1||42%||10,500||$656,250||$1,804,688|
Explanation: If you rank #1 for a search term that receives 25,000 monthly searches, you will receive 42% of the clickthroughs (on average). If your average customer engagement is $5,000, and you convert 5% of the people who land on your site into leads, and close 25% of those into sales, then you should expect $656,250 in MONTHLY sales from that #1 ranked SEO term. If each customer works with you on 1.75 projects over the course of their purchasing lifetime with you, you’ll generate $1,804,688 in total lifetime sales, based on one month of searches for that term. SEO ranking really matters!
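The arithmetic in the explanation above can be checked with a short script. Note that the article’s $1,804,688 lifetime figure works out to 2.75 times the monthly sales number; the sketch below assumes this means the first engagement plus 1.75 repeat projects, which is my reading of the figures rather than something the article states outright.

```python
# Sketch of the SEO value calculation from the example above.
# Assumption: "1.75 projects over the purchasing lifetime" means 1.75
# repeat projects on top of the first engagement (2.75x total), which
# is what reproduces the article's $1,804,688 lifetime figure.

def seo_value(monthly_searches, clickthrough, lead_rate, close_rate,
              deal_size, repeat_projects):
    visitors = monthly_searches * clickthrough   # people who click the #1 result
    leads = visitors * lead_rate                 # visitors who become leads
    sales = leads * close_rate                   # leads who close into customers
    monthly_revenue = sales * deal_size
    lifetime_revenue = monthly_revenue * (1 + repeat_projects)
    return monthly_revenue, lifetime_revenue

monthly, lifetime = seo_value(25_000, 0.42, 0.05, 0.25, 5_000, 1.75)
print(monthly)   # 656250.0
print(lifetime)  # 1804687.5
```

Plugging in your own search volume, conversion rates, and average deal size gives a quick estimate of what a #1 ranking for a given term could be worth to your business.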