The following occupation is officially recognized under U.S. Bureau of Labor Statistics (BLS) occupational codes:
Advertising and promotions managers
What Advertising, Promotions, and Marketing Managers Do
Advertising, promotions, and marketing managers typically do the following:
- Work with department heads or staff to discuss topics such as budgets and contracts, marketing plans, and the selection of advertising media
- Plan promotional campaigns such as contests, coupons, or giveaways
- Plan advertising campaigns, including which media to advertise in, such as radio, television, print, online media, and billboards
- Negotiate advertising contracts
- Evaluate the look and feel of websites used in campaigns or layouts, which are sketches or plans for an advertisement
- Initiate market research studies and analyze their findings to understand customer and market opportunities for businesses
- Develop pricing strategies for products or services marketed to the target customers
- Meet with clients to provide marketing or related advice
- Direct the hiring of advertising, promotions, and marketing staff and oversee their daily activities
Advertising managers create interest among potential buyers of a product or service. They do this for a department, for an entire organization, or on a project basis (referred to as an account). Advertising managers work in advertising agencies that put together advertising campaigns for clients, in media firms that sell advertising space or time, and in organizations that advertise heavily.
Advertising managers work with sales staff and others to generate ideas for an advertising campaign. They oversee the staff that develops the advertising. They work with the finance department to prepare a budget and cost estimates for the campaign.
Often, advertising managers serve as liaisons between the client and the advertising or promotion agency that develops and places the ads. In larger organizations with extensive advertising departments, different advertising managers may oversee in-house accounts and creative and media services departments.
In addition, some advertising managers specialize in a particular field or type of advertising. For example, media directors determine the way in which an advertising campaign reaches customers. They can use any or all of various media, including radio, television, newspapers, magazines, the Internet, and outdoor signs.
Advertising managers known as account executives manage clients’ accounts, but they are not responsible for developing or supervising the creation or presentation of advertising. That task becomes the work of the creative services department.
Promotions managers direct programs that combine advertising with purchasing incentives to increase sales. Often, the programs use direct mail, inserts in newspapers, Internet advertisements, in-store displays, product endorsements, or special events to target customers. Purchasing incentives may include discounts, samples, gifts, rebates, coupons, sweepstakes, or contests.
Marketing managers estimate the demand for products and services that an organization and its competitors offer. They identify potential markets for the organization’s products.
Marketing managers also develop pricing strategies to help organizations maximize their profits and market share while ensuring that the organizations’ customers are satisfied. They work with sales, public relations, and product development staff.
For example, a marketing manager may monitor trends that indicate the need for a new product or service. Then he or she may assist in the development of that product or service and help create a marketing plan for it.
You will play a key role as SEO Specialist in the growing Audience Growth team at one of the world’s leading publishers. Specifically, you will work across the breadth of its consumer media brands.
SEO Specialist Job Description
What is an SEO Specialist?
The mission of an SEO (Search Engine Optimization) Specialist is to maximize the volume of inbound organic traffic from search engines to a website. This is accomplished through a combination of on-page and off-page techniques, including link-building, social media strategy, viral marketing, metadata sculpting, site speed optimization, content strategy, information architecture, and more.
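As a small illustration of the on-page side of that list, here is a sketch (using Python's standard library) of the kind of metadata check an SEO runs on a page. The length limits are hypothetical round numbers; real snippet truncation varies by search engine and device.

```python
from html.parser import HTMLParser

# Hypothetical limits; actual snippet truncation varies by engine and device.
TITLE_MAX = 60
DESCRIPTION_MAX = 160

class MetaAudit(HTMLParser):
    """Collect the <title> text and meta description of an HTML page."""
    def __init__(self):
        super().__init__()
        self.title = ""
        self.description = ""
        self._in_title = False

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "title":
            self._in_title = True
        elif tag == "meta" and attrs.get("name") == "description":
            self.description = attrs.get("content", "")

    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False

    def handle_data(self, data):
        if self._in_title:
            self.title += data

def audit(html):
    """Return a list of on-page metadata issues found in the page."""
    parser = MetaAudit()
    parser.feed(html)
    issues = []
    if not parser.title:
        issues.append("missing <title>")
    elif len(parser.title) > TITLE_MAX:
        issues.append("title too long")
    if not parser.description:
        issues.append("missing meta description")
    elif len(parser.description) > DESCRIPTION_MAX:
        issues.append("description too long")
    return issues

page = "<html><head><title>Blue Widgets</title></head><body></body></html>"
print(audit(page))  # → ['missing meta description']
```

In practice a crawler such as Screaming Frog reports the same checks at site scale; the value of a hand-rolled sketch like this is being able to add site-specific rules.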
As a result of the daily changes in the search algorithms of Google, Bing, Yahoo, and other leaders in search, an SEO expert works in a dynamic environment that requires them to be continually learning, fine-tuning their skills, and experimenting to discover how the industry is shifting.
There’s no college program to train an SEO, few industry standards, and many misconceptions and outdated ideas connected to the industry. Professionals in this industry break into their positions through experience, success, research, and trial and error.
At Jobfindjobs Staffing, we specialize in placing SEOs in excellent positions throughout the USA — from Silicon Valley to Silicon Alley and beyond. If you’re seeking work in the industry, we invite you to browse our jobs or apply online to have one of our specialists contact you today!
The SEO’s skill set
- Experience with Google and Bing’s services, including Analytics and Webmaster Tools
- Experience with Google’s Keyword Tool
- A functional understanding of HTML and CSS
- The ability to work with back-end SEO elements such as .htaccess, robots.txt, metadata, site speed optimization, and related skills
- Proven success in link building and viral strategies
- The ability to deploy an effective local and long-tail search strategy
- A deep understanding of mobile strategy and how it relates to SEO
- A solid grasp of how blogging, press releases, social media, and related strategies go hand-in-hand with SEO
- Experience in building inbound organic search traffic and improving SERPs
- A background in creating reports showing web analytics data and site evaluations
- An up-to-date, working knowledge of current, past, and projected trends in the SEO industry
- Familiarity with the best tools in the trade
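As a concrete example of the back-end elements in the list above, Python's standard library can parse robots.txt rules directly. The rules and URLs below are hypothetical:

```python
from urllib.robotparser import RobotFileParser

# A sample robots.txt body; in practice you would fetch it from
# https://example.com/robots.txt (hypothetical domain).
rules = """\
User-agent: *
Disallow: /admin/
Disallow: /private/
"""

rp = RobotFileParser()
rp.parse(rules.splitlines())

# Crawlable product page vs. a blocked admin path.
print(rp.can_fetch("*", "https://example.com/products/widget"))  # True
print(rp.can_fetch("*", "https://example.com/admin/login"))      # False
```

The same parser is handy for auditing whether a disallow rule is accidentally blocking pages you want indexed.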
Related job titles: SEO (Expert, Associate, Coordinator, Specialist, Analyst, Strategist, Marketing Pro), Search Engine Optimization Specialist, Organic Search Manager, Local Search Strategist, SEO Content Strategist/Writer
Related certifications & qualifications
There are many SEO certifications available online, but the general consensus is that most of them are not widely accepted within the SEO community. The certifications provided by Google, however, are viewed in a positive light, and a number of other SEO certifications are well respected in many circles. Below is a short, incomplete list of certifications we recommend:
- GAIQ (Google Analytics Individual Qualification) Certified
- Google AdWords Certified
- MarketMotive SEO Master Certification
- SEMPO: Advanced SEO Certification
Three types of SEO Specialists
People working in search optimization generally work in one of three categories: in-house, agency, or freelance. For both the employee and the employer, each category comes with its own set of implications, as well as pros and cons. Before accepting work, an SEO will have to decide whether the job type fits their lifestyle and personal preferences. One safe way to navigate this decision is to accept a few temp jobs in SEO, so you can try out different work environments and see what's right for you.
In-house SEO Specialist
An in-house SEO works as a full-time employee for a single company, or under the umbrella of several sister companies. You’ll be able to specialize within a niche industry, have greater ownership over the projects at hand, and can take full credit for your brilliant successes (or failures) along the way. Do a great job, and you'll earn a solid, stable position and a respected name among your industry peers.
A company generally hires an in-house SEO to work exclusively on a small set of websites, choosing an in-house over an agency so they can have a more hands-on, responsive approach to SEO. They appreciate the SEO's in-depth understanding of their specific industry — an understanding that’s hard for an agency to guarantee.
As an in-house SEO, you may be the only person with strong SEO knowledge in your company. This means that the burden of educating and "selling" your team on SEO is on you. You will need to be good at communicating, working around corporate roadblocks, and easing the fear of risk (or instilling a little fear, as the case may be) along the way. If your colleagues don't get on board with your strategy, it can definitely slow you down.
Agency SEO Specialist
When an SEO works in an agency, they enjoy the fast-paced learning immersion that a team of related professionals enjoys — a team that they can quickly add to their professional network. Their resume builds quickly with case studies from high-profile brands and companies, and they’re able to build their success on a time-tested groundwork of best practices and a history of data-driven success.
A company will generally work with an agency when they’re looking for a low-cost, "safe" option. They’re drawn to the wider base of experience — they’re not gambling on the skills of a single professional.
Despite all the glamor, agencies often have a high turnover, as SEOs are poached by competing companies and lost clients can lead to employees being phased out. While the pay is solid and the benefits are good, there’s a "Live by the sword, die by the sword" philosophy, and an inherent stress level with it that may not be right for you.
Freelance SEO Specialist
A freelance SEO takes on a much more dramatic set of challenges and opportunities than an in-house or agency-employed professional. You’re your own boss, and you’re completely free to choose your work, your salary, and your benefits.
On the other hand, you’re the biggest fish in a pond of one. If you’re stuck, it’s up to you to dig yourself out. You’ll have to do all your own marketing, negotiating, and networking, and the companies that come to you are often looking for a "deal". You’ll have to deal with an erratic workflow, and you’ll really need to know your stuff.
Some of the biggest challenges a freelance SEO encounters are communicating the value (read: cost) of SEO for a website, establishing realistic expectations (and a time frame for them) with the client, and operating in an environment where you’re essentially a "wizard for hire" whose results can seem magical, are hard to measure, and play out in a mutable universe.
SEO Network is a family of companies that takes a people-first approach to financial services, using technology to empower consumers to overcome debt and create a brighter financial future. The company was founded in 2001 by Andrew and Houssein on the belief that by staying committed to helping people, you can ensure better financial outcomes for both the customer and the business. This Heart/$ philosophy still guides the vision of our growing company, which has helped millions of people find solutions for their financial needs.
What began with 3 people in a spare bedroom has rapidly expanded into a vibrant business with over 19,600 employees (known internally as The SEO NETWORK) in two locations: New York and Washington, DC. When you visit either of our offices, you’ll understand why our employees have voted us a Best Place to Work for the last several years. It’s a place where the Heart/$ philosophy continues to thrive, and where we believe that success is only achieved by doing what’s right for our customers, our employees, and our communities.
The SEO Network is looking for an SEO Manager to improve our organic traffic and shape our future SEO strategy. This is a hands-on search engine optimization position in which you will use SEO tactics and analytical skills to understand and implement best practices across all of The SEO Network's web properties. The role involves a great deal of cross-team collaboration with marketing, product, and design team members.
- Develop and execute quality inbound link strategy
- Conduct keyword research to guide content strategy
- Monitor SEO efforts and conduct routine technical SEO audits across all company sites
- Design experiments to test new strategies and opportunities for organic traffic growth
- A/B test both layout and copy to increase conversion
- Assist with on-page optimization by coordinating with copywriters, web design and development teams
- Keep abreast of SEO and integrated marketing trends
- 2-5 years’ experience with SEO
- Knowledge of HTML/CSS
- Familiarity with relevant SEO and web analytics tools (e.g. Ahrefs, Screaming Frog, Moz, Google Search Console, Google Analytics)
- Mastery in SEO fundamentals like page titles, meta tags, headers, URL structure, etc.
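One of the fundamentals listed above, URL structure, lends itself to a standard-library sketch: producing a canonical form of a URL by dropping tracking parameters and fragments. The tracking-parameter list and example URL are hypothetical and would be adjusted per site.

```python
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

# Hypothetical list of tracking parameters to strip; adjust per site.
TRACKING_PARAMS = {"utm_source", "utm_medium", "utm_campaign", "gclid"}

def canonicalize(url):
    """Produce a cleaner canonical form of a URL: lowercase the
    scheme and host, drop tracking parameters, drop the fragment."""
    parts = urlsplit(url)
    query = [(k, v) for k, v in parse_qsl(parts.query)
             if k not in TRACKING_PARAMS]
    return urlunsplit((
        parts.scheme.lower(),
        parts.netloc.lower(),
        parts.path or "/",   # empty path becomes "/"
        urlencode(query),
        "",                  # drop the fragment
    ))

print(canonicalize("HTTPS://Example.com/Shop?utm_source=nl&page=2#top"))
# → https://example.com/Shop?page=2
```

A rule like this is typically paired with a `rel="canonical"` tag so duplicate URL variants consolidate their ranking signals.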
WHY JOIN THE SEO NETWORK FAMILY?
- Fast, continued growth – there’s a lot of opportunity for advancement
- Voted a Best Place to Work multiple times by our employees
- Benefits start within 30 days
- 401k with employer match
- 3 weeks’ paid vacation (increased with tenure)
- 9 paid holidays & 5 sick days
- Paid time off for volunteer work and on your birthday
This is your opportunity to be part of a growing company where dedicated professionals strive to help customers and each other succeed every day. If that sounds exciting to you, we want to talk to you. Apply today!
A career in SEO (search engine optimisation) is a good choice for you if you possess creative flair and good analytical, technical and marketing skills.
As an SEO specialist you will identify strategies, techniques and tactics to increase the number of visitors to a website and obtain a high-ranking placement in the search results page of search engines.
Those working in SEO may also be known as online marketers or digital account executives. This type of work may be referred to as content marketing or conversion rate optimisation work.
As an SEO specialist, you'll carry out some or all of the following:
- run pay-per-click (PPC) campaigns to increase revenue through sponsored online advertising;
- write original content for webpages - tailored for the target audience;
- make technical recommendations to developers;
- constantly check search terms, rankings and analytics to monitor performance of client websites and make recommendations for improvement;
- engage with other businesses (affiliates) over link-building opportunities, to bring in paid commission for driving customers to their websites;
- employ user experience (UX) and conversion rate optimisation methods to turn visitors into more active users and to help improve website performance;
- community building - to drive targeted and loyal traffic to a website;
- use social media to distribute content and encourage more external sites to link to your content - giving you more authority;
- develop and integrate content marketing strategies;
- monitor the algorithms set by the search engines to keep up to date with changes.
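The rank-monitoring duty in the list above can be sketched as a small week-over-week report. The keyword positions below are invented for illustration; real data would come from a rank tracker or Search Console export.

```python
# Invented tracked-keyword positions (lower number = better ranking).
rankings = {
    "buy blue widgets": {"last_week": 14, "this_week": 9},
    "widget repair":    {"last_week": 6,  "this_week": 8},
    "cheap widgets":    {"last_week": 21, "this_week": 21},
}

def movement_report(data):
    """Classify each keyword as improved, declined, or flat."""
    report = {}
    for term, pos in data.items():
        delta = pos["last_week"] - pos["this_week"]
        if delta > 0:
            report[term] = f"improved by {delta}"
        elif delta < 0:
            report[term] = f"declined by {-delta}"
        else:
            report[term] = "flat"
    return report

for term, status in movement_report(rankings).items():
    print(f"{term}: {status}")
```

The point of a report like this is to turn raw position data into recommendations, the next step the duty list describes.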
- Starting salaries for SEO specialists are normally around £18,000 for trainees, but can be as high as £28,000 at an agency.
- With more experience, executive and management roles can attract salaries of £25,000 to £35,000.
- Top in-house positions in London generally pay higher than agencies, with average salaries for head of department positions attracting around £47,000 and Director-level positions an average of £130,000.
Experienced freelancers can command high levels of pay.
Income figures are intended as a guide only.
Working hours are normally 9am to 5pm. You may need to work occasional evenings and weekends, for example, if you are involved in analysing traffic for a large marketing campaign and there is a looming deadline.
What to expect
- SEO work requires you to constantly review your skills and keep up to date with ever-changing technological developments and trends. Although there is a lot of competition in the traditional SEO market, the sector is booming and it is predicted that SEO experts will always be in demand.
- You will need to be flexible as you may be working on a range of areas from programming to writing original content for webpages. However, you will often find that you have a lot of control over how you manage your workload. It will be necessary to invest time and effort into link-building strategies and to ensure that you keep up to date by reading SEO and internet marketing blogs.
- You can expect to find SEO roles in large enterprises with marketing departments and these tend to be based in cities and towns throughout the UK. Digital marketing and media agencies also tend to be located in the major cities. Agencies which operate globally are likely to have opportunities to work abroad. Self-employment as a freelance SEO consultant is possible for experienced SEO specialists.
- Travel may be required especially in freelance and agency roles where visits to clients are an essential part of the job.
- A professional dress code can be expected on client sites. However, there may be a more relaxed approach in SMEs and digital marketing agencies.
Related case studies
Technology company director
Head of search (SEO)
This area of work is open to all graduates but a degree in a related area such as IT, business and technology or marketing may be particularly useful.
Entry into graduate-level SEO/marketing roles with an HND, or foundation degree is possible but you will need to have substantial, relevant experience and technical skill.
Postgraduate qualifications in digital marketing are offered by an increasing number of UK universities and although not essential, may be helpful for expanding your skills and knowledge, particularly if you have a non-related first degree. Search for postgraduate courses in digital marketing.
A range of qualifications in SEO and digital marketing are offered by the:
- Chartered Institute of Marketing (CIM)
- The Institute of Direct and Digital Marketing (IDM)
- Institute of Practitioners in Advertising (IPA) - Search Certificate
Other training providers offer short certified courses and these include:
- Emarketeers SEO Foundation Training Course
- Jellyfish SEO Training Courses
You will need to show:
- a genuine interest in SEO and/or digital marketing;
- evidence of a strong personal online profile, e.g. a blog, website or connections and interactions on various social media platforms;
- a commitment to keeping your technical skills and knowledge up to date;
- an inquisitive mind which drives you to understand Google's algorithms and predict what changes might be coming;
- a good understanding of search engines;
- an ability to understand social media platforms and how to use them to distribute content, gain more links and build successful campaigns;
- a good understanding of PR and how to get the right messages out there;
- an understanding of how customers search, where they search and why they purchase something online;
- a good understanding of the wider marketing context;
- strong analytical skills for understanding ranking algorithms;
- excellent communication skills to educate, inform, persuade and mediate across a number of stakeholders in an organisation;
- proficiency in Microsoft Office and an excellent knowledge of Excel;
- some experience and knowledge of webhosting.
Though highly competitive, the best way to gain good experience in SEO is to undertake an undergraduate SEO/digital marketing placement or internship, with a large graduate recruiter. If you would like to get some work experience locally, try speculatively contacting digital marketing agencies, charities and the marketing departments of companies in your area. Any work experience where you are helping to improve an organisation's website is good experience.
To develop your digital skills, consider setting up your own website or blog, which as well as showing your commitment, interest and initiative, will give you something to discuss at interviews. You can find advice on how to get into SEO work, including information about various digital marketing career paths, at Digital Marketing Career Zone.
Keep up to date with industry news, jobs, recruitment and networking events by following digital marketing companies on Twitter and reading their blogs. Relevant professional organisations can help, for example the IDM and the IPA have blogs which cover key topics and developments in digital marketing.
Most companies and organisations with an online presence, especially those selling products and services, will require some level of help with their SEO. For this reason SEO specialists are very much in demand, but you will need to be creative in your job search.
Positions in SEO may be called SEO specialist, but they are also likely to be called:
- online marketing executive;
- content marketing officer;
- digital marketing executive;
- media and design executive.
Large corporates will employ SEO specialists within their marketing departments. SMEs are also likely to employ SEO specialists but you'll probably have a wider marketing remit. Many SEO specialists choose to go freelance, or work in a small consultancy.
One route into this type of work is to join a digital marketing agency to gain experience before moving on.
Look for job vacancies at:
- Brand Republic Jobs
- Chinwag Jobs
- SEO Jobs
- The Drum Jobs
Details of marketing work placements and graduate jobs are also available at CIM's getin2marketing: Work Placement Resource. Specialist recruitment agencies such as these advertise SEO vacancies, but you will need to prove that you have some knowledge and experience:
- Clockwork Talent
- Opilio Recruitment
- Orchard - covers the North of England.
Once in the role of SEO specialist you will usually receive on-the-job training in the digital marketing tools used by the company. These vary but, for example, could include:
- Adobe Creative Suite for online banner design;
- Google AdWords for pay-per-click campaigns;
- Google Analytics and Facebook Insights for web analytics reporting.
Many companies provide financial support to undertake professional qualifications such as those offered by the CIM. These include:
- CAM Level 4 Diploma in Digital Marketing;
- Level 4 Certificate in Professional Marketing (Digital);
- Level 6 Diploma in Professional Marketing (Digital).
For more information see CIM Qualifications.
The IDM also has professional qualifications, including:
- Professional Diploma in Digital Marketing;
- Professional Diploma in Direct and Digital Marketing.
For a full list of available qualifications see the IDM Professional Marketing Qualifications.
The training offered in a smaller organisation is likely to be practical and intensive with more wide-ranging responsibilities early on. This may suit some graduates better than working for a larger company on a graduate training scheme.
Continuing professional development (CPD) is important in SEO work due to the fast-moving developments in emerging technologies. It is possible to take a range of short courses to help with this. The CIM, IDM and IPA offer courses on digital marketing topics such as:
- affiliate marketing;
- social and digital metrics and analytics;
- pay-per-click (PPC);
- search engine optimisation (SEO).
These are designed to fill any gaps in your knowledge and to update you on new developments in digital marketing tools and platforms.
Other training courses specifically covering SEO training from beginner level to advanced include those offered by:
- Jellyfish
As an SEO specialist, various career paths are open to you. You will generally start your SEO career in a junior role on an internship and then move to SEO executive. As your career progresses you may become an SEO manager or account manager, then head of SEO to head of digital. However, you could equally move from SEO executive to a copywriter role and move into a broader digital marketing route earlier on.
Progression to more experienced roles such as SEO manager could happen within two to five years if you work for a company which offers on-the-job training and encourages CPD. There is an increasing requirement to gain further qualifications to enter senior marketing roles.
If you reach digital account director level, you will have overall responsibility for managing accounts, strategy and digital marketing campaigns. You will take on additional responsibilities such as budgets and training and mentoring junior members of the team.
Once you have experience and can show that you meet the specified criteria, which includes a certain level of CPD, you can achieve chartered status with the CIM. This can help with career progression and further information is available at CIM: Chartered Marketer Status.
Prepare for these 15 job interview questions
The best way to approach a job interview for an SEO position is to come in prepared with a solid understanding of the company’s existing websites and documented suggestions as to how you would improve their SEO in the short term and the long term. Additionally, be prepared to tackle these challenging questions:
- Which SEO tools do you use?
- Which industry thought leaders do you follow in SEO?
- In your previous roles, what metrics did you track, and what kinds of reporting did you provide?
- How have you adapted your SEO style to align with emerging best practices and the recent changes in the Google algorithm?
- What is your mobile strategy?
- How do you approach an SEO-friendly content strategy?
- What do you feel are the three most important elements of on-page SEO?
- What do you feel are the three most important elements of off-page SEO?
- What actions do you recommend to integrate Social Media into your overall SEO strategy?
- How would you describe your link-building process? Do you have a viral strategy?
- How will you integrate your SEO knowledge in with our creative and design team?
- What are some examples of how you've brought an increase in organic traffic in your previous organic search campaigns?
- Which peripheral skills (coding, html/css, content strategy, IA, etc.) do you have that can supplement your SEO skills?
- When would you use robots.txt vs. a robots meta tag?
- How would you apply an SEO strategy for a client who had a website programmed in Flash?
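The robots.txt-versus-meta-tag question above has a concrete answer worth rehearsing: robots.txt blocks crawling of a path (so the engine never even fetches the page), while a robots meta tag is read on a crawled page and controls indexing and link-following. A minimal sketch of both forms, with a hypothetical path and directives:

```python
# robots.txt blocks crawling: the engine never fetches a matching page,
# so it cannot see any meta tags on it (and the URL may still be indexed
# from external links, just without content).
robots_txt_rule = "User-agent: *\nDisallow: /drafts/"

# A robots meta tag requires the page to be crawled, but tells the
# engine not to index it and/or not to follow its links.
def robots_meta(directives):
    """Build a robots meta tag from a list of directives."""
    return f'<meta name="robots" content="{", ".join(directives)}">'

tag = robots_meta(["noindex", "nofollow"])
print(tag)  # → <meta name="robots" content="noindex, nofollow">
```

The practical consequence, and a common interview follow-up: a `noindex` tag on a page that robots.txt also disallows will never be honored, because the crawler is barred from reading it.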
Brush up on your SEO knowledge
Search engines are evolving; to stay relevant within the industry, an SEO must build ongoing learning into their lifestyle. And as SEO becomes increasingly social, there’s a lot of value in building a strong professional network, both in person and online. To this end, we recommend that anyone in SEO check out the following blogs and events.
SEO Conventions & Training Events:
Apply for an SEO job with NY Staffing
If you’re looking for the next step in your career, Paladin Staffing is ready to connect you with an excellent SEO opportunity in your area. Launch the next stage of your career by browsing our job listings, or apply online to connect with one of our recruiters today!
Creating brilliant work is a rewarding experience. But we think you deserve a little more than that. That's why we offer exemplary benefits to our associates, including the following:
- Medical and dental coverage: We offer a variety of combinations to suit your needs and your budget.
- 401(k) savings: Don’t hold back your brilliance, rather hold on to some cash for retirement.
- Direct deposit: You want to get paid as quickly and simply as possible. We make it happen.
- Longevity bonuses: We want you to be a part of the Paladin team for a long time, and we’ll make it worth your while.
- Paid PTO and holidays: You enjoy what you do, but you would probably enjoy some time off, too. You can earn paid time off for holidays and vacations.
SEO Specialist: Job Description
- A Search Engine Optimization (SEO) Specialist analyzes, reviews, and implements changes to websites so they are optimized for search engines. This means maximizing traffic to a site by improving its rank within search engine results.
- Simply put, it is the job of the SEO Specialist to make your website show up at the top of the search engine results. Ten years ago that job looked a lot different than it does now, and it requires a whole new skill set from what was needed back then.
- A modern specialist must be a problem solver and decision maker, with the ability to prioritize and develop relevant and engaging content. You know the old adage, “Content is king”? Well, modern SEO specialists know that search engines are placing increasing value on quality content, which will invariably include the keywords or phrases that increase traffic to a site.
- They may also test and implement various search engine marketing techniques, website layouts, and advertising approaches for search engine optimization. They also know the importance of internal links; the ability to problem-solve comes in handy when finding the best locations and the best approach for them.
- An SEO Specialist will analyze websites for improvements, have an in-depth knowledge of keyword research, understand SEO copywriting, and serve as a liaison between various departments.
- A degree and a minimum of one to three years of web experience are required for the SEO Specialist position, including knowledge of HTML, CSS, a programming language, and blogging.
Search Engine Optimization (SEO) Specialist job description
This Search Engine Optimization (SEO) Specialist job description template is optimized for posting on online job boards or careers pages. It’s easy to customize this SEO job description with key duties and responsibilities for your company or agency. Similar job titles include SEO Analyst, Strategist, Manager and Consultant.
SEO Specialist Responsibilities
- Optimizing copy and landing pages for search engine optimization
- Performing ongoing keyword research including discovery and expansion of keyword opportunities
- Researching and implementing content recommendations for organic SEO success
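The keyword discovery-and-expansion responsibility above can be sketched as a simple combination step that turns seed terms into long-tail candidates. The seeds and modifiers are invented for illustration; in practice they come from keyword tools and search query reports, and the candidates are then checked for volume and intent.

```python
from itertools import product

# Hypothetical seed keywords and modifiers.
seeds = ["running shoes", "trail shoes"]
modifiers = ["best", "cheap", "womens"]

def expand(seeds, modifiers):
    """Generate long-tail keyword candidates by prefixing each
    seed term with each modifier."""
    return [f"{m} {s}" for m, s in product(modifiers, seeds)]

candidates = expand(seeds, modifiers)
print(len(candidates))  # → 6
print(candidates[0])    # → best running shoes
```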
Hiring an SEO specialist?
We are looking for an SEO/SEM expert to manage all search engine optimization and marketing activities.
You will be responsible for managing all SEO activities such as content strategy, link building and keyword strategy to increase rankings on all major search networks. You will also manage all SEM campaigns on Google, Yahoo and Bing in order to maximize ROI.
- Execute tests, collect and analyze data and results, identify trends and insights in order to achieve maximum ROI in paid search campaigns
- Track, report, and analyze website analytics and PPC initiatives and campaigns
- Manage campaign expenses, staying on budget, estimating monthly costs and reconciling discrepancies.
- Optimize copy and landing pages for search engine marketing
- Perform ongoing keyword discovery, expansion and optimization
- Research and implement search engine optimization recommendations
- Research and analyze competitor advertising links
- Develop and implement link building strategy
- Work with the development team to ensure SEO best practices are properly implemented on newly developed code
- Work with editorial and marketing teams to drive SEO in content creation and content programming
- Recommend changes to website architecture, content, linking and other factors to improve SEO positions for target keywords.
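The ROI bookkeeping behind the paid-search duties above can be sketched with two standard metrics: cost-per-click (CPC) and return on ad spend (ROAS). All campaign figures below are invented for illustration.

```python
# Invented campaign figures for illustration only.
campaigns = [
    {"name": "brand",   "spend": 500.0,  "clicks": 1000, "revenue": 2500.0},
    {"name": "generic", "spend": 1200.0, "clicks": 1500, "revenue": 1080.0},
]

def summarize(c):
    """Compute cost-per-click and return on ad spend (ROAS)."""
    cpc = c["spend"] / c["clicks"]    # cost per click
    roas = c["revenue"] / c["spend"]  # revenue per dollar of spend
    return {"name": c["name"], "cpc": round(cpc, 2), "roas": round(roas, 2)}

for c in campaigns:
    s = summarize(c)
    print(f'{s["name"]}: CPC ${s["cpc"]:.2f}, ROAS {s["roas"]:.2f}x')
```

A ROAS below 1.0 (like the "generic" campaign here) means the campaign returns less revenue than it costs, the kind of trend the reporting duties above exist to surface.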
- Proven SEO experience
- Proven SEM experience managing PPC campaigns across Google, Yahoo and Bing.
- Solid understanding of performance marketing, conversion, and online customer acquisition
- In-depth experience with website analytics tools (e.g., Google Analytics, NetInsight, Omniture, WebTrends)
- Experience with bid management tools (e.g., Click Equations, Marin, Kenshoo, Search Ignite)
- Experience with A/B and multivariate experiments
- Knowledge of ranking factors and search engine algorithms
- Up-to-date with the latest trends and best practices in SEO and SEM
- BS/MS degree in a quantitative, test-driven field
Search Engine’s Guide To SEO
As a companion to the table, Search Engine’s Guide To SEO explains the ranking factors in more depth, in a tutorial providing tips and advice on implementing them.
Links to the entire guide are shown below (start at the beginning, and each page will take you to the next):
- Chapter 1: Types Of Search Engine Success Factors
· TYPES OF SEARCH ENGINE SUCCESS FACTORS
- The table outlines the factors that help a website gain visitors from search.
- Search Engine Land’s Periodic Table Of SEO Success Factors covers three main groups:
- On-the-page SEO
- Off-the-page SEO
- Violations
- Each group has subgroups that further explain the SEO guide and its importance. All the factors in the periodic table work together. The first letter of each SEO element is derived from its subgroup and the second letter from the individual factor.
· SEO FACTORS WORK IN COMBINATION
- No single SEO factor can guarantee search engine rankings. Every factor matters, from quality content to the HTML title to authority. Positive factors increase the odds of success; negative factors worsen those odds.
· ON THE PAGE RANKING FACTORS
- These factors are under the publisher’s control. They include decisions such as what content to publish, how to signal relevancy, and which HTML clues to provide to help search engines.
- For more detail on on-page SEO strategies, see the link below:
- “Onpage SEO Strategies made easy”
· OFF THE PAGE RANKING FACTORS
- Search engines rely on these because publisher-controlled signals do not always produce the best results; off-page ranking factors are not controlled by the publisher. On-page signals alone are not enough to sort through billions of web pages, so other signals are needed as well.
- To go deeper into off-page SEO, see the link below:
- “OffPage SEO Optimization”
· SEO VIOLATIONS AND RANKING PENALTIES
- Search engines want people to perform SEO, because good SEO helps improve their search results.
- Search engines are a great help here, providing blog posts, guidelines, and videos that encourage recommended SEO techniques.
- All factors matter, but some carry less weight than others in the chart. The weighting is a combination of what search engines have said, surveys, and the expertise and experience of practitioners.
· MISSING SEO FACTORS AND THE GUIDE’S PHILOSOPHY
- The SEO periodic table and this guide are mainly aimed at helping those new to the business of SEO gain experience. That is the only reason the guide does not address details such as whether important keywords belong at the beginning or the end of an HTML title tag.
- Such details are distracting and pull readers down a rabbit hole, so the guide avoids being ultra-specific. Its aim is to help others learn the broadly good ideas.
- Social networks such as Facebook and Twitter also help achieve this success. Active social accounts attract a large audience and help build reputation and sharing.
Search Engine Optimization Success Factors Explained
If you have a site or blog, you are most likely familiar with the basics of search engine optimization. SEO is essentially the techniques that you use to get higher rankings among search engines organically. There are a number of different strategies that you can try from link building to using keywords. The possibilities really are endless when you begin using effective SEO techniques. However, the only way to assure that you have the most success and see the desired results is to be completely aware of SEO success factors. This means that you need to know what matters most when it comes to SEO techniques and what has the greatest influence over your site or blog achieving more visibility online. It is essential that you understand how Search Engine Optimization factors work in combination and what techniques you should focus more of your efforts on. Having a greater understanding of SEO success factors will enable you to create an SEO strategy that offers the best results. Here is more information on SEO success factors and how they influence the strategies that you should choose:
It is always important to note that no single Search Engine Optimization factor stands out from the rest; no one factor will guarantee success above all others. There is no search engine success guarantee when it comes to SEO. Consider the HTML title as an example: a great HTML title offers SEO potential, but it is all for naught if that title sits on a page with low-quality content. This means there really are no shortcuts in the world of SEO, and the same is true for link building and other SEO strategies that are known to work. If you do not make sure that your content is high quality and valuable to online readers, you will not have success. Having more than one positive factor on your side improves your chances: a great HTML title combined with great links and high-quality content increases your odds of success drastically. SEO factors work in combination, and this is the way you need to approach your SEO strategy.
It is important to understand that there are different types of SEO success factors you need to be aware of. They can be categorized in a number of ways, and one of these categories is referred to as on-page. The on-page search ranking factors are the ones you have the most control over, so they are the ones to concern yourself with most, because you have the ability to change the results. The on-page success factors you can control involve the type of content you choose to create and publish and the on-site SEO techniques you choose to employ. This also includes the HTML clues you leave on your site to help search engines gauge its relevancy, and the overall design of your site or blog. All the on-page Search Engine Optimization factors are under your control, which means you need to plan carefully how you will use them to your advantage to gain higher visibility among search engines.
While you have control over the on-page Search Engine Optimization success factors, it is important to understand off-page success factors and how they work. These are the success factors you can’t directly control, which makes SEO strategy even more difficult when it comes to off-page factors. The reason search engines choose to rely on off-page factors is that they do not want to depend entirely on the publishers of content for rankings; some of the factors that determine search engine ranking are not left up to site publishers. Controlled signals on a site can be used to make a site or blog seem more relevant than it really is, which is why search engines do not trust on-page factors alone. On-page clues are not enough for most search engines, and in order to produce a proper search ranking, off-page factors are also included.
Penalized for SEO
It is widely known that search engines do reward sites and blogs for Search Engine Optimization, which means that using SEO techniques is a great way to gain visibility online. However, it is also important to understand that some techniques are categorized by search engines as black hat or spam. These are the techniques you need to shy away from, because they will not help you get a higher search engine ranking and will only hinder your efforts. Before you employ SEO techniques, you need to learn about the strategies that are frowned upon and penalized.
I. Introduction – What Is SEO
Whenever you enter a query in a search engine and hit 'enter' you get a list of web results that contain that query term. Users normally tend to visit websites that are at the top of this list as they perceive those to be more relevant to the query. If you have ever wondered why some of these websites rank better than the others then you must know that it is because of a powerful web marketing technique called Search Engine Optimization (SEO).
SEO is a technique which helps search engines find and rank your site higher than the millions of other sites in response to a search query. SEO thus helps you get traffic from search engines.
This SEO tutorial covers all the necessary information you need to know about Search Engine Optimization - what is it, how does it work and differences in the ranking criteria of major search engines.
- How Search Engines Work
The first basic truth you need to know to learn SEO is that search engines are not humans. While this might be obvious to everybody, the differences between how humans and search engines view web pages aren't. Unlike humans, search engines are text-driven. Although technology advances rapidly, search engines are far from intelligent creatures that can feel the beauty of a cool design or enjoy the sounds and movement in movies. Instead, search engines crawl the Web, looking at particular site items (mainly text) to get an idea of what a site is about. This brief explanation is not the most precise because, as we will see next, search engines perform several activities in order to deliver search results: crawling, indexing, processing, calculating relevancy, and retrieving.
First, search engines crawl the Web to see what is there. This task is performed by a piece of software called a crawler or a spider (or Googlebot, in Google's case). Spiders follow links from one page to another and index everything they find on their way. Bearing in mind the number of pages on the Web (over 20 billion), it is impossible for a spider to visit a site daily just to see if a new page has appeared or an existing page has been modified; sometimes crawlers may not visit your site for a month or two.
After a page is crawled, the next step is to index its content. The indexed page is stored in a giant database, from where it can later be retrieved. Essentially, the process of indexing is identifying the words and expressions that best describe the page and assigning the page to particular keywords. A human could not process such amounts of information, but search engines generally deal with this task just fine. Sometimes they might not get the meaning of a page right, but if you help them by optimizing it, it will be easier for them to classify your pages correctly and for you to get higher rankings.
When a search request comes, the search engine processes it – i.e. it compares the search string in the search request with the indexed pages in the database. Since it is likely that more than one page (practically it is millions of pages) contains the search string, the search engine starts calculating the relevancy of each of the pages in its index with the search string.
There are various algorithms to calculate relevancy. Each of these algorithms has different relative weights for common factors like keyword density, links, or metatags. That is why different search engines give different search results pages for the same search string. What is more, it is a known fact that all major search engines, like Yahoo!, Google, Bing, etc. periodically change their algorithms and if you want to keep at the top, you also need to adapt your pages to the latest changes. This is one reason (the other is your competitors) to devote permanent efforts to SEO, if you'd like to be at the top.
The last step in search engines' activity is retrieving the results. Basically, it is nothing more than simply displaying them in the browser – i.e. the endless pages of search results that are sorted from the most relevant to the least relevant sites.
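The crawl-index-process-retrieve pipeline described above can be sketched in miniature. This is a toy illustration only, not how any real search engine is implemented; the pages and the crude occurrence-count relevancy score are invented for the example.

```python
from collections import defaultdict

# Toy corpus standing in for crawled pages (URLs and text are invented).
pages = {
    "dog-adopt.net/adopt.html": "adopt a dog and give a dog a home",
    "dog-adopt.net/food.html": "homemade dog food recipes for your dog",
    "example.com/cats.html": "cats are independent pets",
}

# Indexing: build an inverted index mapping each word to the pages containing it.
index = defaultdict(set)
for url, text in pages.items():
    for word in text.split():
        index[word].add(url)

def search(query):
    """Process a query: find pages containing every query word, then
    rank them by a crude relevancy score (occurrences of query words)."""
    words = query.split()
    matches = set(pages)
    for w in words:
        matches &= index.get(w, set())
    def score(url):
        text = pages[url].split()
        return sum(text.count(w) for w in words)
    # Retrieval: most relevant results first.
    return sorted(matches, key=score, reverse=True)

print(search("dog food"))  # → ['dog-adopt.net/food.html']
```

Real engines differ enormously in scale and in how relevancy is weighted, but the five stages (crawl, index, process, calculate relevancy, retrieve) map onto the steps shown here.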
- Differences Between the Major Search Engines
Although the basic principle of operation of all search engines is the same, the minor differences between them lead to major changes in results relevancy. Different factors are important to different search engines. There were times when SEO experts joked that the algorithms of Bing were intentionally made just the opposite of Google's. While this might have a grain of truth, it is a matter of fact that the major search engines like different things, and if you plan to conquer more than one of them, you need to optimize carefully.
There are many examples of the differences between search engines. For instance, for Yahoo! and Bing, on-page keyword factors are of primary importance, while for Google links are very, very important. Also, for Google, sites are like wine: the older, the better. Yahoo! generally has no expressed preference for sites and domains with tradition (i.e. older ones). Thus you might need more time for your site to mature enough to be admitted to the top in Google than in Yahoo!.
- Chapter 2: Content & Search Engine Success Factors
Content and Search Engine Success Factors of a Website
Content is king; it is the most important element of SEO, because the combination of search engine optimization and content marketing is an ideal recipe for success. SEO content should be unique, helpful, and relevant to your post or website. Google prefers quality content, and with its help you can reach the rank you want. SEO content is ideal when it serves both human readers and search engine crawlers. Quality content with the right keywords can optimize your website and increase your targeted traffic by improving your search visibility. Content and search engines are the success factors of a website.
Search engines require fresh and relevant content to give you a position on the first page; if your content is effective, search engines will prefer it. When you start writing, the first things to know are what your topic is about, what information is required, and what kind of material your audience likes. You should also consider the content strategy your competitors are implementing.
Because Google ranks sites with good authority, you must produce plenty of useful, engaging content on your site related to your topic. When you post relevant content, Google can easily identify the topics you have authority in. For quality content to reach the first page of a search engine, follow these steps:
- First, identify keywords related to your topic.
- Optimize your title.
- Use keywords naturally; don’t overstuff them.
- Link the page to other relevant pages on your website or to related websites.
SEO has changed a lot. By following the latest SEO strategies you can reach the rank you want in search engines, and following them will also improve the user experience. If you ignore them, all your efforts are wasted and you cannot maintain your position in search engines.
Search engines do reward sites and blogs for SEO, meaning that by following SEO techniques you can increase your visibility online. However, some SEO techniques are categorized by search engines as black hat. These are the techniques you need to avoid, because they will not help you get a higher search engine ranking.
II. Keywords – the Most Important Item in SEO
Keywords are the most important SEO element for every search engine: they are what search strings are matched against. Choosing the right keywords to optimize for is thus the first and most crucial step of a successful SEO campaign. If you fail at this very first step, the road ahead is very bumpy, and most likely you will only waste your time and money. There are many ways to determine which keywords to optimize for; usually the final list is made after a careful analysis of what the online population is searching for, which keywords your competitors have chosen, and, above all, which keywords you feel describe your site best.
- Choosing the Right Keywords to Optimize For
It seems that the time when you could easily top the results for a one-word search string is centuries ago. Now, when the Web is so densely populated with sites, it is next to impossible to achieve constant top ratings for a one-word search string. Achieving constant top ratings for two-word or three-word search strings is a more realistic goal.
For instance, if you have a site about dogs, do NOT try to optimize for the keyword "dog" or "dogs". Instead, focus on keywords like "dog obedience training", "small dog breeds", "homemade dog food", "dog food recipes", etc. Success with very popular one- or two-word keywords is very difficult and often not worth the trouble; it's best to focus on less competitive, highly specific keywords.
The first thing you need to do is come up with keywords that describe the content of your website. Ideally, you know your users well and can correctly guess which search strings they are likely to use to find you. You can also try the Website Keyword Suggestions Tool below to come up with an initial list of keywords. Run your initial list through the Google Keyword Suggestion tool; you'll get a related list of keywords. Shortlist a couple that seem relevant and have a decent global search volume.
When choosing the keywords to optimize for, you need to consider not only the expected monthly number of searches but also the relevancy of those keywords to your website. Although narrow keywords get fewer searches, they are a lot more valuable than generic keywords, because their users are more interested in your offerings. Let's say you have a section on your website where you give advice on what to look for when adopting a dog. You might discover that the "adopt german shepherd" keyphrase gives you better results than a keyword like "german shepherd dogs". This page is of interest not to current german shepherd owners, but only to potential ones. So, when you look at the number of search hits per month, consider the unique hits that fit the theme of your site.
- Keyword Density
After you have chosen the keywords that describe your site and are supposedly of interest to your users, the next step is to make your site keyword-rich and achieve a good keyword density for your target keywords. Keyword density, although no longer a very important factor in SEO, is a common measure of how relevant a page is. Generally, the idea is that the higher the keyword density, the more relevant the page is to the search string. The recommended density is 3-7% for the major 2 or 3 keywords and 1-2% for minor keywords. Try the Keyword Density Checker below to determine the keyword density of your website.
Although there are no strict rules, try optimizing for a reasonable number of keywords – 5 or 10 is OK. If you attempt to optimize for a list of 300, you will soon see that it is just not possible to have a good keyword density for more than a few keywords without making the text sound artificial and stuffed with keywords. And what is worse, there are severe penalties (including a ban from the search engine) for keyword stuffing, because it is considered an unethical practice that tries to manipulate search results.
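Keyword density is simple enough to compute yourself. Here is a minimal sketch; the sample text and the 3-7% guideline check mirror the figures above, and the exact counting rules (whitespace splitting, whole-phrase matching) are simplifying assumptions.

```python
def keyword_density(text, keyword):
    """Percentage of words in `text` that belong to occurrences of
    `keyword` (case-insensitive). Multi-word keyphrases are matched
    as whole phrases against the whitespace-split word list."""
    words = text.lower().split()
    phrase = keyword.lower().split()
    n = len(phrase)
    if not words or n == 0:
        return 0.0
    hits = sum(1 for i in range(len(words) - n + 1) if words[i:i + n] == phrase)
    return 100.0 * hits * n / len(words)

text = ("Adopting a dog is rewarding. Dog adoption gives a dog "
        "a second chance, and dog adoption enriches your home.")
d = keyword_density(text, "dog adoption")
print(f"{d:.1f}%")
# This deliberately stuffed sample lands well above the 7% ceiling.
if not 3 <= d <= 7:
    print("density outside the recommended range - consider revising")
```

A real checker would also strip punctuation and HTML before counting, but the idea is the same: flag pages whose major keywords fall outside the 3-7% band.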
- Keywords in Special Places
Keywords matter not only in quantity but also in quality: keywords in the page title, the headings, or the first paragraphs count for more than keywords at the bottom of the page. The reason is that the URL (and especially the domain name), file names, directory names, the page title, and the section headings are more important than ordinary text on the page. Therefore, all else being equal, if you have the same keyword density as your competitors but also have keywords in the URL, this will boost your ranking incredibly, especially with Yahoo!.
a. Keywords in URLs and File Names
The domain name and the whole URL of a site tell a lot about it. The presumption is that if your site is about dogs, you will have “dog”, “dogs”, or “puppy” as part of your domain name. For instance, if your site is mainly about adopting dogs, it is much better to name your dog site “dog-adopt.net” than “animal-care.org”, for example, because in the first case you have two major keywords in the URL, while in the second one you have no more than one potential minor keyword.
When hunting for keyword-rich domain names, don't get greedy. While from an SEO point of view it might be better to have 5 keywords in the URL, just imagine how long and hard to memorize that URL would be. So you need to strike a balance between keywords in the URL and site usability, which says that more than 3 words in the URL is way too much.
You probably won't be able to come up with tons of good suggestions on your own. Additionally, even if you manage to think of a couple of good domain names, they might already be taken. In such cases, tools like the one below can come in very handy.
File names and directory names are also important. Often search engines will give preference to pages that have a keyword in the file name. For instance http://mydomain.com/dog-adopt.html is not as good as http://dog-adopt.net/dog-adopt.html but is certainly better than http://mydomain.com/animal-care.html. The advantage of keywords in file names over keywords in URLs is that they are easier to change, if you decide to move to another niche, for example.
b. Keywords in Page Titles
The page title is another special place because the contents of the <title> tag usually get displayed in most search engines (including Google). While the HTML specification does not require you to write anything in the <title> tag (you can leave it empty, and the title bar of the browser will read “Untitled Document” or similar), for SEO purposes you do not want to leave it empty; instead, write the page title in it.
Unlike URLs, with page titles you can get wordy. To continue the dog example, the <title> tag of the home page for http://dog-adopt.net can include something like this: <title>Adopt a Dog – Save a Life and Bring Joy to Your Home</title>, <title>Everything You Need to Know About Adopting a Dog</title>, or even longer.
c. Keywords in Headings
Normally, headings separate text into related subtopics. From a literary point of view it may be pointless to have a heading after every other paragraph, but from an SEO point of view it is extremely good to have as many headings on a page as possible, especially if they contain your keywords.
There are no technical length limits for the contents of the <h1>, <h2>, <h3>, ... <hn> tags, but common sense says that overly long headings are bad for page readability. So, as with URLs, you need to be wise about heading length. Another issue to consider is how the heading will be displayed. A Heading 1 (<h1>) generally means a larger font size, in which case it is advisable to keep the heading under 7-8 words; otherwise it might spread over 2 or 3 lines, which is not good, so avoid it if you can.
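A quick way to audit this is to parse a page's headings and flag the overly long ones. The sketch below uses only Python's standard-library HTML parser; the sample HTML and the 8-word limit are illustrative assumptions.

```python
from html.parser import HTMLParser

class HeadingChecker(HTMLParser):
    """Collects <h1>-<h6> text and flags headings over a word limit."""
    def __init__(self, max_words=8):
        super().__init__()
        self.max_words = max_words
        self.in_heading = None  # tag of the heading currently open, if any
        self.buffer = []
        self.warnings = []

    def handle_starttag(self, tag, attrs):
        if tag in ("h1", "h2", "h3", "h4", "h5", "h6"):
            self.in_heading = tag
            self.buffer = []

    def handle_data(self, data):
        if self.in_heading:
            self.buffer.append(data)

    def handle_endtag(self, tag):
        if tag == self.in_heading:
            text = "".join(self.buffer).strip()
            if len(text.split()) > self.max_words:
                self.warnings.append((tag, text))
            self.in_heading = None

checker = HeadingChecker(max_words=8)
checker.feed("<h1>Adopt a Dog and Save a Life Today in Your Local Area</h1>"
             "<h2>Dog Food Recipes</h2>")
for tag, text in checker.warnings:
    print(f"{tag}: '{text}' is longer than 8 words")
```

Here the twelve-word <h1> gets flagged while the short <h2> passes, matching the readability guideline above.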
- Chapter 3: Site Architecture & Search Engine Success Factors
Site Architecture And Search Engine Success Factors
The next major On-The-Page group in the Periodic Table Of SEO Success Factors is site architecture. The right site structure can help your SEO efforts thrive, while the wrong one can cripple them.
Search engines “crawl” sites, traveling between pages incredibly quickly, acting like hyperactive speed-readers. They make copies of your pages, which get stored in what’s called an “index,” something like a giant book of the web.
When somebody searches, the search engine flips through this enormous book, finds all the relevant pages, and then picks what it thinks are the very best ones to show first. To be found, you must be in the book. To be in the book, you must be crawled.
Each site is given a crawl budget, an approximate amount of time or number of pages a search engine will crawl each day, based on the relative trust and authority of the site. Larger sites may try to improve their crawl efficiency to ensure that the “right” pages are crawled more often. The use of robots.txt, internal link structures, and especially telling search engines not to crawl pages with certain URL parameters can all improve crawl efficiency.
For most sites, though, crawl issues can easily be avoided. It is also good practice to use sitemaps, both HTML and XML, to make it easy for search engines to crawl your site.
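You can check whether a given URL is blocked from crawling by a robots.txt file with Python's standard library. The rules and URLs below are made up for the illustration.

```python
from urllib.robotparser import RobotFileParser

# A hypothetical robots.txt: keep crawlers out of internal search
# result pages while leaving the rest of the site crawlable.
robots_txt = """\
User-agent: *
Disallow: /search
Sitemap: https://example.com/sitemap.xml
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

# Ordinary content pages remain crawlable...
print(rp.can_fetch("*", "https://example.com/dresses/red"))         # True
# ...while parameterized search pages are excluded from the crawl.
print(rp.can_fetch("*", "https://example.com/search?q=red+dress"))  # False
```

The same parser is what crawlers themselves consult, so testing your rules this way before deploying them helps avoid accidentally blocking pages you want indexed.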
More Google searches happen on mobile devices than on desktop. Given this, it’s no surprise that Google rewards mobile-friendly sites with a shot at better rankings in mobile searches, while sites that aren’t mobile-friendly may have a harder time showing up. Bing does likewise.
So make your site mobile-friendly. You’ll improve your chances of success in search rankings while making your mobile visitors happy. And if you have an app, consider making use of app indexing and linking, which both search engines offer.
Sometimes that big book, the search index, gets messy. Flipping through it, a search engine may find page after page after page of what looks like virtually the same content, making it harder to figure out which of those many pages it should return for a given search. This is bad.
It gets worse if people are actively linking to different versions of the same page. Those links, an indicator of trust and authority, are suddenly split between those versions. The result is a distorted (and lower) perception of the true value users have assigned to that page. That is why canonicalization is so important.
You only want one version of a page to be available to search engines.
There are many ways duplicate versions of a page can creep into existence. A site may have www and non-www versions of the site rather than redirecting one to the other. An e-commerce site may let search engines index its paginated pages, but nobody is searching for “page 9 red dresses”. Or filtering parameters might be appended to a URL, making it look (to a search engine) like a different page.
For as many ways as there are to create URL bloat unintentionally, there are ways to address it. Proper implementation of 301 redirects, the use of rel=canonical tags, managing URL parameters, and effective pagination strategies can all help ensure you’re running a tight ship.
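One of those cleanup steps, collapsing variant URLs to a single canonical form, can be sketched as follows. The parameter names treated as noise (tracking tags, session IDs, sort order) are assumptions for the example; which parameters are safe to drop depends on your site.

```python
from urllib.parse import urlparse, urlunparse, parse_qsl, urlencode

# Query parameters assumed (for this example) to create duplicate URLs
# without changing the page content.
NOISE_PARAMS = {"utm_source", "utm_medium", "utm_campaign", "sessionid", "sort"}

def canonicalize(url):
    """Collapse variant URLs to one canonical form: force the www host,
    drop noise parameters, and sort the remaining ones."""
    p = urlparse(url)
    host = p.netloc if p.netloc.startswith("www.") else "www." + p.netloc
    params = sorted((k, v) for k, v in parse_qsl(p.query) if k not in NOISE_PARAMS)
    return urlunparse((p.scheme, host, p.path, "", urlencode(params), ""))

print(canonicalize("http://example.com/red-dresses?utm_source=mail&size=9"))
print(canonicalize("http://www.example.com/red-dresses?size=9&sessionid=42"))
# both collapse to http://www.example.com/red-dresses?size=9
```

The canonical URL produced this way is what you would point rel=canonical tags and 301 redirects at, so link value accumulates on one version of the page.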
For more, see our category covering duplication and canonicalization issues: SEO: Duplicate Content.
Google wants to make the web a faster place and has declared that fast sites get a small ranking advantage over slower ones.
However, making your site blazing fast isn’t a guaranteed express ride to the top of the search results. Speed is a minor factor that affects only 1 in 100 queries, according to Google.
Still, speed can reinforce other factors and may actually improve them. We’re an impatient bunch these days, especially on our mobile devices! So engagement (and conversion) on a site may improve thanks to a fast load time.
Speed up your site! Search engines and humans alike will appreciate it.
Below is some of our past coverage of the importance of site speed:
SEO: Site Speed
Are Your URLs Descriptive?
Yes. Having the words you want to be found for within your domain name or page URLs can help your ranking prospects. It’s not a major factor, but if it makes sense to have descriptive words in your URLs, do so. The articles in the category below explore the power of the URL in more depth:
SEO: Domain Names and URLs
Google would like to see the entire web running on HTTPS servers, in order to provide better security to web surfers. To help make this happen, it rewards sites that use HTTPS with a small ranking boost.
As with the site speed boost, this is just one of many factors Google uses when deciding whether a web page should rank well. By itself, it doesn’t guarantee a place in the top results. But if you’re considering running a secure site anyway, it may contribute to your overall search success.
You need to be careful with how your site handles duplicate content, which is especially worrisome for ecommerce or 'catalogue' type sites, where pagination and searches can cause issues if not dealt with properly.
Search engines are continually modifying and improving their algorithms, which causes movement in organic rankings. You can, for all that, protect your site from any considerable impact by following a few fundamental guidelines, as emphasized in the periodic table. The various search engine success factors for site architecture involve the following:
Your site's architecture should ensure that search engine spiders can conveniently crawl and access your site. It is best to avoid big files, images, videos, or code that slow down your site. Apply a user-friendly URL structure and ensure that the words used are suitable to the page topic.
Good structure: www.sitename.co.uk/topic/keyword
Bad structure: www.sitename.co.uk/category?1/pageid&=45
Web indexes "slither" websites, moving across different pages impressively quickly, acting like overzealous speed-pursuers. They also create replicas or what we say duplicate copies of your web pages that get stored away in what’s being termed as a "record," which depicts a boundless book of the web.
When a user views, the web crawler upturns through this gigantic book, explores all the applicable pages and after that selects what it assumes are the outright superior ones to show first. To be viewed upon, you must be there in the book. To be in the book, you must be crawled.
Each site is provided a crawl spending scheme, an approximate calculation of pages or time a web index will creep every day, in order of the relative reliance and expertise of a website. Larger destinations may attempt to improve their creeping competency to assure that the "correct" pages are being slithered all the more repeatedly. The use of the robots.txt, inside connection composition and specifically instructing web crawlers not to crawl pages with specific URL criterion can all improve crawling optimality.
For most sites, however, crawl concerns can easily be avoided. It is also good practice to use both HTML and XML sitemaps to make it easy for search engines to crawl your website.
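For reference, a minimal XML sitemap following the sitemaps.org protocol looks like this; the URL and date are placeholders:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.sitename.co.uk/topic/keyword</loc>
    <!-- Optional: when the page last changed -->
    <lastmod>2015-09-23</lastmod>
  </url>
</urlset>
```

One `<url>` entry per page you want crawled; submit the file via the search engines' webmaster tools or reference it from robots.txt.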
More Google searches now take place on mobile phones than on desktop. Given this, it is no big surprise that Google is rewarding sites that are mobile-friendly with a chance at better rankings in mobile searches, while those that aren't may have a tough time showing up at all. Bing is doing the same.

So get your site mobile-friendly. You will improve your chances of success in search rankings while keeping your mobile visitors happy. In addition, if you have an app, consider making use of app indexing and linking, which both search engines offer.
Mobile-friendliness has grown in weight as a result of Google's mobile-friendly algorithm update, and while it is not a heavily weighted factor, we note that a secure site (HTTPS) now also counts as an SEO success factor.
From time to time, that gigantic book, the search index, gets messy. Flipping through its pages, a search engine may find page after page of what looks, for all intents and purposes, like the same content, making it harder to figure out which of those many pages it should return for a given search. This is not good.

It gets worse if people are actively linking to different versions of the same page. Those links, an indicator of trust and authority, are suddenly split between those versions. The result is a distorted (and lower) reflection of the true popularity users have conferred on that page. That is why canonicalization is so important.
You want only one version of a page to be available to search engines.
There are countless ways duplicate versions of a page can creep into existence. A site may have both www and non-www versions of its pages instead of redirecting one to the other. An ecommerce site may allow search engines to index its paginated pages, even though nobody is searching for "page 8 of red dresses." And tracking parameters may be appended to a URL, making it appear (to a search engine) like a different page.

For as many ways as there are to inflate URL counts unintentionally, there are as many strategies to address it. Proper implementation of 301 redirects, use of rel=canonical tags, ignoring URL parameters and sound pagination practices can all help ensure you are running a tight ship.
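As a sketch of the rel=canonical approach, each duplicate version of a page carries a link element in its <head> pointing at the one version you want indexed (the URL is a placeholder):

```html
<!-- Placed in the <head> of every duplicate version of the page,
     e.g. the paginated or parameterised URLs -->
<link rel="canonical" href="https://www.sitename.co.uk/topic/keyword">
```

Search engines that honour the tag then consolidate link signals onto the canonical URL instead of splitting them across the duplicates.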
Google wants the web to be a faster place and has declared that fast sites get a slight ranking advantage over slow ones.

That said, making your site blazing fast is not a guaranteed express ticket to the top of the search results. Speed is a minor factor that affects only about 1 in 100 queries, according to Google.

Even so, speed can reinforce other factors and may indirectly improve them. We are an impatient bunch of people these days, especially when we are on our smartphones! So engagement and conversions on a site may improve thanks to a quick load time.

Speed up your site! Search engines and people alike will welcome it.
Below is some of our past coverage of the importance of site speed:
Site optimization: Site Speed
Are Your URLs Descriptive?
Yes. Having the words you want to be found for in your page URLs or domain name can help your ranking potential. It is not a major factor, but if it makes sense to use descriptive words in your URLs, do so. The write-ups in the category below survey the importance of the URL in more depth:
Website optimization: URLs and Domain Names
Google would like to see the whole web running on HTTPS servers, in order to provide better security to web surfers. To help make this happen, it rewards sites that use HTTPS with a slight ranking boost.

As with the site speed boost, this is just one of many factors Google uses when deciding whether a web page deserves to rank well. On its own, it does not guarantee a place in the top SERP results. Still, if you are considering running a secure site anyway, this may contribute to your organic search acquisition.
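If you do move to HTTPS, a common companion step is a site-wide 301 redirect so visitors and link signals land on the secure version. A minimal sketch, assuming an Apache server with mod_rewrite enabled (your hosting setup may differ):

```apache
# .htaccess sketch - assumes Apache with mod_rewrite enabled
RewriteEngine On
# If the request did not arrive over HTTPS...
RewriteCond %{HTTPS} off
# ...301-redirect it to the HTTPS equivalent of the same URL
RewriteRule ^(.*)$ https://%{HTTP_HOST}/$1 [R=301,L]
```

The permanent (301) status tells search engines the move is intentional, so the HTTPS URLs become the canonical ones.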
III. Backlinks – Another Important SEO Item
What are Backlinks?
In layman's terms, there are two types of links: inbound and outbound. Outbound links start from your site and lead to an external site, while inbound links, or backlinks, come from an external site to yours. For example, if cnn.com links to yourdomain.com, the link from cnn.com is a backlink (inbound) for yourdomain.com; from cnn.com's perspective, it is an outbound link. Backlinks are among the main building blocks of good Search Engine Optimisation (SEO).
Why Backlinks Are Important
The number of backlinks is an indication of the popularity or importance of a website. Backlinks are important for SEO because search engines such as Google give more credit to websites that have a large number of quality backlinks, and consider those websites more relevant than others in their results pages for a search query.
Therefore, when search engines calculate the relevance of a site to a keyword, they not only consider the number of backlinks to that site but also their quality. In order to determine the quality, a search engine considers the content of the sites. When backlinks to your site come from other sites, and those sites have content related to your site, these backlinks are considered more relevant to your site. If backlinks are found on sites with unrelated content, they are considered less relevant. The higher the relevance of backlinks, the greater their quality.
For example, if a webmaster has a website about how to rescue orphaned dogs and receives a backlink from another website about dogs, that would count as more relevant in a search engine's assessment than, say, a link from a site about car racing. Therefore, the higher the relevance of the site linking back to your website, the better the quality of the backlink.
Search engines want websites to have a level playing field, and look for natural links built slowly over time. While it is fairly easy to modify your webpages to make them more SEO friendly, it is a lot harder to influence other websites and get them to link to yours. This is the reason search engines regard backlinks as a very important factor. Further, search engines' criteria for quality backlinks have gotten even tougher, thanks to unscrupulous webmasters trying to obtain backlinks through deceptive or sneaky techniques, such as hidden links or automatically generated pages whose sole purpose is to provide backlinks to websites. These pages are called link farms, and they are not only disregarded by search engines, but linking to a link farm could get your site banned entirely.
When a link incorporates a keyword into the text of the hyperlink, we call this anchor text. A link's anchor text may be one of the most powerful resources a webmaster has. Backlinks from multiple websites with the anchor text "orphaned dogs" would help your website rank higher for the keyword "orphaned dogs". Using your keyword in this way is a far better use of a hyperlink than having links with text like "click here", which does not relate to your website. The Backlink Anchor Text Analysis Tool will help you find your backlinks and the text being used to link to your website. If you find that your site is being linked to from another website but the anchor text is not being used well, you should request that the website change the anchor text to something that incorporates relevant keywords. This will also help boost your rankings.
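For illustration, using the hypothetical URLs from the earlier example, the difference between descriptive and generic anchor text looks like this in the linking site's HTML:

```html
<!-- Descriptive anchor text: tells search engines what the target page is about -->
<a href="https://yourdomain.com/orphaned-dogs">orphaned dogs</a>

<!-- Generic anchor text: carries no keyword signal about the target page -->
<a href="https://yourdomain.com/orphaned-dogs">click here</a>
```

Both links pass authority, but only the first associates the target page with the keyword "orphaned dogs".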
Ways to Build Backlinks
Even if plenty of backlinks come to your site the natural way, additional quality backlinks are always welcome.
1 The Backlink Builder Tool
When you enter the keywords of your choice, the Backlink Builder tool gives you a list of relevant sites from which you might get some backlinks.
2 Getting Listed in Directories
If you are serious about your Web presence, getting listed in directories like DMOZ, Yahoo! and the Jasmin directory is a must, not only because this is a way to get some quality backlinks for free, but also because this way you are easily noticed by both search engines and potential visitors. Generally, inclusion in search directories is free, but the drawback is that you sometimes have to wait a couple of months before you get listed in the categories of your choice.
3 Forums and Article Directories
Generally search engines index forums so posting in forums and blogs is also a way to get quality backlinks with the anchor text you want. If the forum or blog is a respected one, a backlink is valuable. However, in some cases the forum or blog administrator can edit your post, or even delete it if it does not fit into the forum or blog policy. Also, sometimes administrators do not allow links in posts, unless they are relevant ones. In some rare cases (which are more an exception than a rule) the owner of a forum or a blog would have banned search engines from indexing it and in this case posting backlinks there is pointless.
4 RSS Feeds
You can offer RSS feeds to interested sites for free. When another site publishes your RSS feed, you get a backlink to your site and potentially a lot of visitors, who will come to your site for more details about the headline and abstract they read on the other site.
5 Affiliate programs
Affiliate programs are also good for getting more visitors (and buyers) and for building quality backlinks, but they tend to be an expensive way because the affiliate commission is generally in the range of 10 to 30%. But if you have an affiliate program anyway, why not use it to get some more quality backlinks?
6 News Announcements and Press Releases
Although this is hardly an everyday way to build backlinks, it is an approach that gives good results if handled properly. There are many sites that publish news announcements and press releases for free or for a small fee. A professionally written press release about an important event can bring you many visitors, and a backlink from a respected site is a good boost to your SEO efforts. The tricky part is that you cannot issue press releases when there is nothing newsworthy. That is why we say that news announcements and press releases are not a commodity way to build backlinks.
Link Practices That Are To Be Avoided
There has been much discussion in recent months about reciprocal linking. In the past few Google updates, reciprocal links were one of the targets of the search engine's latest filters. Many webmasters had agreed upon reciprocal link exchanges in order to boost their sites' rankings. In a link exchange, one webmaster places a link on his website that points to another webmaster's website, and vice versa. Many of these links were simply not relevant and were just discounted. So while the irrelevant backlinks were ignored, the outbound links still got counted, diluting the relevancy score of many websites. This caused a great many websites to drop off the Google map.
There is a Google patent in the works that will deal with not only the popularity of the sites being linked to, but also how trustworthy a site is that you link to from your own website. This will mean that you could get into trouble with the search engine just for linking to a bad apple.
Many webmasters have more than one website. Sometimes these websites are related, sometimes they are not. You also have to be careful about interlinking multiple websites on the same IP. If you own seven related websites, then a link to each of those websites on a page could hurt you, as it may look to a search engine like you are trying to do something fishy. Many webmasters have tried to manipulate backlinks in this way, and too many links to sites with the same IP address is referred to as backlink bombing.
One thing is certain, interlinking sites doesn't help you from a search engine standpoint. The only reason you may want to interlink your sites in the first place might be to provide your visitors with extra resources to visit. In this case, it would probably be okay to provide visitors with a link to another of your websites, but try to keep many instances of linking to the same IP address to a bare minimum. One or two links on a page here and there probably won't hurt you.
What are Meta tags ?
Meta tags are used to summarize information about a page for search engine crawlers. This information is not directly visible to humans visiting your website. The most popular are the meta keywords and meta description tags. These meta tags are to be inserted into the <head> area of your page.
A couple of years ago meta tags were the primary tool for search engine optimization and there was a direct correlation between keywords in the meta tags and your ranking in the search results. However, algorithms have got better and today the importance of metadata is decreasing day by day.
The meta description tag is one more way for you to write a description of your site, thus pointing search engines to the themes and topics your website is relevant to. Some search engines (including Google) use the meta description to display a summary of the listing on the search results page. So if your meta descriptions are well written, you might be able to attract more traffic to your website.
For instance, for the dog adoption site, the meta Description tag could be something like this:
<meta name="Description" content="Adopting a dog saves a life and brings joy to your house. All you need to know when you consider adopting a dog.">
A potential use of the meta keywords tag is to include a list of keywords that you think are relevant to your pages. The major search engines will not take this into account, but it is still a chance for you to emphasize your target keywords. You may consider including alternative spellings (or even common misspellings) of your keywords in the meta keywords tag. It might give a very small boost to your search engine rankings, but why miss the chance?
<meta name="Keywords" content="adopt, adoption, dog, dogs, puppy, canine, save a life, homeless animals">
In the meta robots tag you specify the pages that you do NOT want crawled and indexed. It happens that your site has content that you need to keep there but do not want indexed. Listing these pages in the meta robots tag is one way to exclude them from being indexed (the other way is by using a robots.txt file, which is generally the better way to do it).
<META NAME="ROBOTS" CONTENT="NOINDEX, NOFOLLOW">
V. Content Is King
If you were writing SEO text solely for machines, optimization would be simple. Sprinkle in some keywords, rearrange them at random and watch the hit counter skyrocket. Sometimes SEO copy writers forget that this isn't the case. Real people read your text and they expect something in return for the time and attention they give you. They expect good content, and their expectations have shaped how search engines rank your site.
What Is Good Content?
Good SEO content has three primary characteristics:
- Offers useful information presented in an engaging format to human readers
- Boosts search engine rankings
- Attracts plenty of links from other sites
Note that human readers come first on the list. Your site must deliver value to its visitors and do it in an engaging way. Few sites specialize in a subject so narrow that they have an information niche all to themselves. You'll have competition. Set yourself apart from it with expert interviews, meaningful lists and well-researched resources. Write well or invest in someone who does; your investment will pay off in increased traffic.
Although search engines aren't your primary audience, they still influence your page rankings. In the days of early SEO, using keyword-stuffed META tags brought in plenty of traffic. People didn't hang around on a site that promised low air fares and delivered advertisements, but that didn't affect the search engines. Each iteration of the engines' algorithms got better at discerning valuable sites from clutter, though, so site creators had to sharpen their technique as well. Instead of META tags, they used keywords sprinkled throughout an article.
In April 2011, Google's algorithm change devalued keyword and keyphrase "spam" in favor of more nuanced means of determining a web site's value to viewers. This update sent ripples throughout the Internet. From major commerce sites to hobbyists' blogs, search engines boosted high-value sites and cast down some once-mighty sites that relied too much on keyword-stuffing. Keywords haven't lost their value, but they no longer provide the only cue to search engines.
If SEO keywords have become devalued, links have grown in value. If other sites link to yours as an engaging read, controversial screed or authoritative text, search engines view your page as a site that viewers will want to see and bump it up accordingly. Filling your site with link bait will get you noticed by search engines and the people who use them, and the best way to draw links is with strong, fresh content. Social media sites provide even more buzz for pages with great content. Those links count too, so court them with content-rich pages.
Writing SEO Content for Search Engines -- And for People
SEO no longer means scattering keywords like Hansel and Gretel throwing breadcrumbs. The newest search engines scan pages almost as your readers might. Jakob Nielsen, a researcher and expert in human-computer interaction, found that almost 80 percent of a website's visitors scanned the page rather than reading it line by line. They spent their first fractions of a second on the page deciding if it was worth their time. Search engine programmers still use this research to devise algorithms that provide more organic and meaningful rankings.
The same things that catch a visitor's eye will get a search engine's attention. The upper left corner of the page is the most valuable real estate on the page, as it's where a reader's eyes go first. Put important text there so search engines and people will see it immediately. It's also a good spot for boxed text and itemized lists, both of which appeal equally to carbon-based and silicon-based brains.
Bold text makes people and machines notice, but use those tags judiciously. Too much bold text looks like an advertisement and will cause search engines to devalue your site. Bold and italic HTML tags should surround meaningful concepts, not mere emphasis words. Bolding a "very" or italicizing a "more" means nothing to a search engine, so apply those tags to important concepts and sub-headings.
Searches now look for associated terms and relevant phrases, not just keywords. A person picks up meaning from context and readily distinguishes the term "clipping" as it applies to hair from the same word as it refers to film stock or video game graphics. Let your visitors -- human and machine -- know whether you're talking about German shepherds as a dog breed or as an exciting career in European wool and mutton. In your SEO text, include synonyms and relevant terms to let search engines recognize the purpose of your site.
Happily, there's a way to work these terms into your content without monitoring keyword and keyphrase percentages: simply write the kind of engaging copy that people like to read. If you write for readers, the search engines will follow.
SEO Killers - Duplicate Content, Spam and Filler
You have a handle on what modern SEO content should be, but it's also vital to understand what it shouldn't be. Nielsen's research described what kept readers on web sites and shed light on what drove them away. Search engines take these same factors into account and rank pages down or even remove them from ranking altogether.
Duplicate content can sink a site. Even legally obtained duplicate content, such as articles linked whole from news feeds and large blocks of attributed quotes, diminishes a site's SEO value. Readers have no reason to visit a site that gives them other sites' news verbatim. Page ranks will decline over time without original content.
While you don't want large blocks of duplicate content on your site, you want the timely information that your news feeds deliver. Build fresh new content on the foundation of other information whenever possible. It takes more effort to assimilate and summarize a news story or to use it as a link within an original article, but doing so will cast your site in a more positive light. If you add sufficient value with sharp writing and relevant links, you'll find yourself in the search engine stratosphere.
The old method of following keyword formulas and meeting keyword percentages is not only outdated, it will actively lower your site's rank. Heavy keyword-loading is the hallmark of advertising web sites, and search engines know it. Using related words and relevant phrases to enhance topic recognition marks your site as valuable and drives its search engine value higher. Varied writing is also more readable to your human visitors.
Nielsen found that human readers shunned sites full of filler phrases. Clear, concise web writing has greater value than sprawling pages full of fluff. Hyperbole and promotional language -- describing a product as "the best ever" or "the perfect solution," for example -- contributes nothing to the meaning of the text. Human readers filter out fluff and software ranks down sites with too much of it, so eliminate it from your site.
Search engines change their algorithms regularly in an effort to provide their users with more relevant results. The state of SEO art changes with them. The only constant in web writing is its human audience. Pages that provide novel, appealing content in a reader-friendly format will rise to the top of the rankings.
Try the Similar Page Checker to check the similarity between two URLs.
Content Is King forever.
VI. Visual Extras and SEO
Images

Images are an essential part of any Web page and, from a designer's point of view, they are not an extra but a mandatory item for every site. However, here designers and search engines are at opposite poles, because for search engines every piece of information that is buried in an image is lost. When working with designers, it sometimes takes a while to explain that having textual links (with proper anchor text) instead of shiny images is not a whim, and that clear text navigation really is mandatory. Yes, it can be hard to find the right balance between artistic effect and SEO-friendliness, but since even the finest site is lost in cyberspace if it cannot be found by search engines, some compromise on visual appearance cannot be avoided.

With all that said, the idea is not to skip images altogether. That is impossible nowadays anyway, because the result would be a very ugly site. Rather, the idea is that images should be used for illustration and decoration, not for navigation or, even worse, for displaying text (in a fancy font, for example). Most importantly, in the alt attribute of the <img> tag, always provide a meaningful textual description of the image. The HTML specification does not require this, but search engines do. It also does not hurt to give meaningful names to the image files themselves rather than naming them image1.jpg, image2.jpg, imageN.jpg. For instance, in the next example the image file has an informative name and the alt attribute provides enough additional information: <img src="one_month_Jim.jpg" alt="A picture of Jim when he was a one-month puppy">. That said, don't go to extremes like writing 20-word alt attributes for 1-pixel images, because this looks suspicious and starts to smell like keyword stuffing.
Animation and Movies
The situation with animation and movies is similar to that with images: they are valuable from a designer's point of view but not loved by search engines. For instance, it is still pretty common to have an impressive Flash introduction on the home page. You just cannot imagine what a disadvantage this is with search engines; it is a number one rankings killer! It gets even worse if you use Flash to tell a story that could be written in plain text and hence crawled and indexed. One workaround is to provide search engines with an HTML version of the Flash movie, but in this case make sure that you have excluded the original Flash movie from indexing (this is done in the robots.txt file, but the explanation of that file is not a beginner's topic, which is why it is excluded from this tutorial); otherwise you can be penalized for duplicate content.

There are rumors that Google is building new search technology that will allow searching inside animation and movies, and that the .swf format will contain new metadata that search engines can use. Until then, you'd better either refrain from using (too much) Flash, or at least provide a textual description of the movie.
Frames

It is good news that frames are slowly but surely disappearing from the Web. Five or ten years ago they were an absolute hit with designers, but never with search engines. Search engines have difficulties indexing framed pages because the URL of the page is the same no matter which of the separate frames is open. This was a shock for search engines, because there were actually 3 or 4 pages behind a single URL, while for a search engine 1 URL is 1 page. Of course, search engines can follow the links to the pages in the frameset and index them, but this is a hurdle for them.

If you still insist on using frames, make sure that you provide a meaningful description of the site in the <noframes> tag. The following example is not for beginners, but even if you do not understand everything in it, just remember that the <noframes> tag is the place to provide an alternative version (or at least a short description) of your site for search engines and for users whose browsers do not support frames. If you decide to use the <noframes> tag, you may want to read more about it before you start.
Example: <noframes> <p> This site is best viewed in a browser that supports frames. </p><p> Welcome to our site for prospective dog adopters! Adopting a homeless dog is a most noble deed that will help save the life of the poor creature. </p></noframes>
VII. Static Versus Dynamic URLs
Based on the previous section, you might have gotten the impression that search engine algorithms thwart every designer's effort to make a site gorgeous. Well, it has been explained why search engines do not like images, movies, applets and other extras. Now you might think search engines are far too picky for disliking dynamic URLs as well. Honestly, though, users are not in love with URLs like http://domain.com/product.php?cid=1&pid=5 either, because such URLs do not tell much about the contents of the page.
There are a couple of good reasons why static URLs score better than dynamic URLs. First, dynamic URLs are not always there – i.e. the page is generated on request after the user performs some kind of action (fills a form and submits it or performs a search using the site's search engine). In a sense, such pages are nonexistent for search engines, because they index the Web by crawling it, not by filling in forms.
Second, even if a dynamic page has already been generated by a previous user request and is stored on the server, search engines might just skip it if it has too many question marks and other special symbols in it. Once upon a time search engines did not index dynamic pages at all, while today they do index them but generally slower than they index static pages.
The idea is not to revert to static HTML only. Database-driven sites are great, but it is much better if you serve your pages to search engines and users in a format they can easily handle. One solution to the dynamic URLs problem is called URL rewriting. There are special tools (different for different platforms and servers) that rewrite URLs in a friendlier format, so they appear in the browser like normal HTML pages. Try the URL Rewriting Tool below; it will convert the cryptic text from the previous example into something more readable, like http://mydomain.com/product-categoryid-1-productid-5.
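One common way to implement this, as a sketch assuming an Apache server with mod_rewrite enabled (other servers have their own tools), is a rewrite rule that maps the friendly URL from the example back to the dynamic one internally:

```apache
# .htaccess sketch - assumes Apache with mod_rewrite enabled
RewriteEngine On
# Serve /product-categoryid-1-productid-5 from product.php?cid=1&pid=5
# without changing the URL the visitor (or crawler) sees
RewriteRule ^product-categoryid-([0-9]+)-productid-([0-9]+)$ product.php?cid=$1&pid=$2 [L]
```

Visitors and search engines only ever see the clean URL; the rewrite to the query-string version happens inside the server.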
Chapter 5: Trust, Authority, Identity & Search Rankings
If search engines can decide to trust links or social accounts, can they learn to trust websites? Absolutely. Many SEOs believe that site trust plays a big role in whether a site will succeed or fail from a search perspective.

Ta: Authority

Is your site an authority? Is it a widely recognized leader in its field, area, business or in some other way? That's the goal.

No one knows exactly how search engines calculate authority and, in fact, there are probably multiple "authority" signals. The type of links your site receives (lots of quality or 'neighborhood' links?), social references (from respected accounts?) and engagement metrics (long clicks?) may all play a role in site authority. Of course, negative sentiment and reviews may hurt site authority.

There's little doubt that search engines try to assess authority. One only needs to look through the questions Google told publishers to ask themselves when building high-quality sites that should be immune to "Panda" updates. The words trust, authority and expertise are all frequently mentioned.

Te: Engagement

A quality site should produce meaningful interactions with users. Search engines may try to measure this interaction, or engagement, in a variety of ways.

For example, how long do users stay on your page? Did they search, click through to your listing, but then immediately "bounce" back to the results to try something else? That "pogosticking" behavior can be measured by search engines and could be a sign that your content isn't engaging.

Conversely, are people spending a relatively long time reviewing your content, in relation to similar content on other sites? That "time on site" metric, or "long click," is another type of engagement that search engines can measure and use to assess the relative value of content.

Social gestures such as comments, shares and "likes" represent another way that engagement might be measured.

Search engines are typically cagey about the use of engagement metrics, much less the specifics of those metrics. However, we do believe engagement is measured and used to inform search results.
- Th: History
- Since search engines are constantly visiting your website, they can get a sense of what’s “normal” or how you’ve behaved over time.
- Are you suddenly linking out to what the search engines euphemistically call “bad neighborhoods”? Are you publishing content about a topic you haven’t typically covered? Such things might raise alarm bells.
- Then again, sites do change just like people do, and often for the better. Changes aren’t taken in isolation. Other factors are also assessed to determine if something worrisome has happened.
- Similarly, a site with a history of violating guidelines and receiving multiple penalties may find it more difficult to work their way back to search prominence. We increased the weight of this factor, in part because we’re seeing that Google doesn’t forget things like Penguin easily.
- In the end, a good overall track record may help you. An older, established site may find it can keep cruising along with search success, while a new site may have to “pay its dues,” so to speak, for weeks, months or even longer to gain respect.
- Ti: Identity
- Search engines have explored various ways to help verify web sites as well as authors that are writing for them. Perhaps the most dramatic attempt was Google Authorship. While Google ended Google Authorship in 2014, the search engine still tries to assess authorship for use with Author Rank in other ways.
- Identity and authorship systems will likely continue to evolve. At the moment, the best ways to tap into identity signals involve Klout for Bing, verifying sites with Bing Webmaster Tools and Google Search Console, plus linking your site to Google+.
HTML Code & Search Engine Success Factors
Written by Yavuz Camgöz
Category: Search Engine Optimization
Published: 23 September 2015
- Header Tags
- Structured Data
- The Meta Description Tag
- HTML Title Tag
- Success of HTML Code in Search Engines
HTML is the underlying code used to create web pages. Search engines can pick up ranking signals from specific HTML elements. Below are some of the most important HTML elements to achieve SEO success.
Success of HTML Code in Search Engines
Ht: HTML Title Tag
Imagine that you wrote 100 different books but gave them all the same exact title. How would anyone understand that they are all about different topics?
Imagine that you wrote 100 different books, and while they did have different titles, the titles weren’t very descriptive — maybe just a single word or two. Again, how would anyone know, at a glance, what the books were about?
HTML titles have always been and remain the most important HTML signal that search engines use to understand what a page is about. Bad titles on your pages are like having bad book titles in the examples above. In fact, if your HTML titles are deemed bad or not descriptive, Google may change them.
So think about what you hope each page will be found for, relying on the keyword research you’ve already performed. Then craft unique, descriptive titles for each of your pages.
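For example, a descriptive title lives in the page’s head section; the page topic and wording below are hypothetical:

```html
<head>
  <!-- Unique per page; leads with the phrase the page should be found for -->
  <title>Wind Turbine Rotors: Sizing, Materials and Replacement Guide</title>
</head>
```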
Hd: The Meta Description Tag
The meta description tag, one of the oldest supported HTML elements, allows you to suggest how you’d like your pages to be described in search listings. If the HTML title is the equivalent to a book title, the meta description is like the blurb on the back describing the book.
SEO purists will argue that the meta description tag isn’t a “ranking factor” and that it doesn’t actually help your pages rank higher. Rather, it’s a “display factor,” something that helps how you look if you appear in the top results due to other factors.
Technically, that’s correct. And it’s one of the reasons we decided to call these “success” factors instead of ranking factors.
A meta description that contains the keywords searched for (in bold) may catch the user’s eye. A well-crafted meta description may help ‘sell’ that result to the user. Both can result in additional clicks to your site. As such, it makes sense for the meta description tag to be counted as a success factor.
Be forewarned, having a meta description tag doesn’t guarantee that your description will actually get used. Search engines may create different descriptions based on what they believe is most relevant for a particular query. But having one increases the odds that what you prefer will appear. And it’s easy to do. So do it.
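As a sketch, the meta description sits alongside the title in the head section; the wording below is hypothetical:

```html
<head>
  <title>Wind Turbine Rotors: Sizing, Materials and Replacement Guide</title>
  <!-- A suggested snippet; search engines may substitute their own for some queries -->
  <meta name="description" content="Compare wind turbine rotor sizes and materials, and learn how to pick the right replacement rotor for your turbine.">
</head>
```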
Hs: Structured Data
What if you could tell search engines what your content was about in their own “language”? Behind the scenes, sites can use specific markup (code) that makes it easy for search engines to understand the details of the page content and structure.
The result of structured data often translates into what is called a “rich snippet”, a search listing that has extra bells and whistles that make it more attractive and useful to users. The most common rich snippet you’re likely to encounter is reviews/ratings, which usually include eye-catching stars.
While the use of structured data may not be a direct ranking factor, it is clearly a success factor. All things being equal, a listing with a rich snippet will get more clicks than one without. And search engines are eager for site owners to embrace structured data, providing new and easier ways for less tech-savvy webmasters to participate.
Structured data has been around for quite some time in various forms. But recently search engines have begun to rely on it more with the advent of Google’s Knowledge Graph and Bing’s Snapshot.
This element debuted in the previous edition of the periodic table, and in this edition we’ve increased the weight, as we see it becoming more important in the future.
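As a sketch, structured data is commonly added as a JSON-LD script using the schema.org vocabulary; the product name and rating values below are hypothetical:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Wind Turbine Rotor",
  "aggregateRating": {
    "@type": "AggregateRating",
    "ratingValue": "4.6",
    "reviewCount": "27"
  }
}
</script>
```

With markup like this in place, search engines may show rating stars in the listing, though rich snippet display is never guaranteed.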
Hh: Header Tags
See the headline up at the top of this page? Behind the scenes, HTML code is used to make that a header tag. In this case, an H1 tag.
See the sub-headlines on the page? Those also use header tags. Each of them is the next “level” down, using H2 tags.
Header tags are a formal way to identify key sections of a web page. Search engines have long used them as clues to what a page is about. If the words you want to be found for are in header tags, you have a slightly increased chance of appearing in searches for those words.
Naturally, this knowledge has caused some people to go overboard. They’ll put entire paragraphs in header tags. That doesn’t help. Header tags are as much for making content easy to read for users as they are for search engines.
Header tags are useful when they reflect the logical structure (or outline) of a page. If you have a main headline, use an H1 tag. Relevant subheads should use an H2 tag. Use headers as they make sense and they may reinforce other ranking factors.
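In HTML, that logical outline might look like this (the headings are hypothetical):

```html
<h1>Wind Turbine Parts</h1>
<p>Intro copy about the parts we stock...</p>

<h2>Rotors</h2>
<p>Details about rotors...</p>

<h2>Shafts</h2>
<p>Details about shafts...</p>
```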
Site Architecture & Search Engine Success Factors
- Web Site Crawlability
- Success of Web Site Architecture in Search Engines
- The next major On-The-Page group in the Periodic Table Of SEO Success Factors is site architecture. The right site structure can help your SEO efforts flourish while the wrong one can cripple them.
- Success of Web Site Architecture in Search Engines
- Ac: Web Site Crawlability
- Search engines “crawl” websites, going from one page to another incredibly quickly, acting like hyperactive speed-readers. They make copies of your pages that get stored in what’s called an “index,” which is like a massive book of the web.
- When someone searches, the search engine flips through this big book, finds all the relevant pages and then picks out what it thinks are the very best ones to show first. To be found, you have to be in the book. To be in the book, you have to be crawled.
- Each site is given a crawl budget, an approximate amount of time or pages a search engine will crawl each day, based on the relative trust and authority of a site. Larger sites may seek to improve their crawl efficiency to ensure that the ‘right’ pages are being crawled more often. The use of robots.txt, internal link structures and specifically telling search engines not to crawl pages with certain URL parameters can all improve crawl efficiency.
- However, for most sites, crawl problems can be easily avoided. In addition, it’s good practice to use sitemaps, both HTML and XML, to make it easy for search engines to crawl your site. Remember, “search engine friendly design” is also “human friendly design!”
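As a sketch of the crawl-efficiency tools mentioned above, a robots.txt file at the site root might keep crawlers out of low-value URLs and point them at an XML sitemap (the paths below are hypothetical):

```
# Hypothetical robots.txt
User-agent: *
# Keep crawlers out of pages with no search value
Disallow: /cart/
Disallow: /search-results/
# Point crawlers at the XML sitemap
Sitemap: https://www.example.com/sitemap.xml
```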
- Ad: Duplication / Canonicalization
- Sometimes that big book, the search index, gets messy. Flipping through it, a search engine might find page after page after page of what looks like virtually the same content, making it more difficult for it to figure out which of those many pages it should return for a given search. This is not good.
- It gets even worse if people are actively linking to different versions of the same page. Those links, an indicator of trust and authority, are suddenly split between those versions. The result is a distorted (and lower) perception of the true value users have assigned that page. That’s why canonicalization is so important.
- You only want one version of a page to be available to search engines.
- There are many ways duplicate versions of a page can creep into existence. A site may have www and non-www versions instead of redirecting one to the other. An e-commerce site may allow search engines to index its paginated pages, but no one is searching for “page 9 red dresses”. Or filtering parameters might be appended to a URL, making it look (to a search engine) like a different page.
- For as many ways as there are to create URL bloat inadvertently, there are ways to address it. Proper implementation of 301 redirects, the use of rel=canonical tags, managing URL parameters and effective pagination strategies can all help ensure you’re running a tight ship.
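For example, a rel=canonical tag goes in the head of each duplicate variant and points at the one version you want indexed (the URLs below are hypothetical):

```html
<!-- On https://www.example.com/red-dresses?sort=price&page=9 -->
<head>
  <link rel="canonical" href="https://www.example.com/red-dresses">
</head>
```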
- Am: Mobile Friendly
- More Google searches happen on mobile devices than on desktop. Given this, it’s no wonder that Google is rewarding sites that are mobile friendly with a chance of better rankings on mobile searches while those that aren’t might have a harder time appearing. Bing, too, is doing the same.
- So get your site mobile friendly. You’ll increase your chance of success with search rankings while making your mobile visitors happy. In addition, if you have an app, consider making use of app indexing and linking, which both search engines offer.
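A minimal starting point for mobile friendliness is the viewport meta tag, which tells browsers to scale the page to the device width:

```html
<head>
  <!-- Without this, mobile browsers render the page at desktop width and shrink it -->
  <meta name="viewport" content="width=device-width, initial-scale=1">
</head>
```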
- As: Site Speed
- Google wants to make the web a faster place and has declared that speedy sites get a small ranking advantage over slower sites.
- However, making your site blistering fast isn’t a guaranteed express ride to the top of search results. Speed is a minor factor that impacts just 1 in 100 queries according to Google.
- But speed can reinforce other factors and may actually improve others. We’re an impatient bunch of folks these days, especially when we’re on our mobile devices! So engagement (and conversion) on a site may improve based on a speedy load time.
- Speed up your site! Search engines and humans will both appreciate it.
- Au: Are Your URLs Descriptive?
- Having the words you want to be found for within your domain name or page URLs can help your ranking prospects. It’s not a major factor but if it makes sense to have descriptive words in your URLs, do so.
Ta: Authority
If search engines can decide to trust links or social accounts, can they also figure out how to trust websites? Absolutely. Many SEOs believe that site trust plays a big part in whether a site will succeed or fail from a search perspective.
Is your site an authority? Is it a widely recognized leader in its field, area or business, or in some other way? That is the goal.
No one knows exactly how search engines calculate authority and, in truth, there are probably multiple “authority” signals. The type of links your site receives (lots of quality links, or “bad neighborhood” links?), social references (from respected accounts?) and engagement metrics (long clicks?) might all play a part in site authority. Of course, negative sentiment and reviews may hurt site authority.
- Chapter 6: Link Building & Ranking In Search Engines
Link Building - How to Build Links for Free
Link building (alternative spellings include linkbuilding and link-building) refers to the process of getting external pages to link to a page on your website. It is one of the many tactics used in search engine optimization (SEO). Building links is a difficult, time-consuming process because not all links are created equal. A link from an authoritative website like the Wall Street Journal will make a greater impact on a SERP than a link from a newly built website, but high-quality links are hard to come by. This guide will teach you how to build quality links.
Remember, link building is imperative in achieving high organic search rankings.
How to Build Links
To implement effective link building, it's important to understand why link building is important and how it works.
With this link building guide, you will learn:
- The definition of link building
- Why link building is an important part of SEO
- How to build reliable and trustworthy links efficiently and with minimal or zero cost
- Link Building tool, tips, and additional resources
Link Building : Why It's Important
Link building is important because it is a major factor in how Google ranks Web pages. Google notes on their site that:
"In general, webmasters can improve the rank of their sites by increasing the number of high-quality sites that link to their pages."
Imagine that we own a site promoting wind turbine equipment that we sell. We're competing with another wind turbine equipment manufacturer. One of the ranking factors Google will look at in determining how to rank our respective pages is link popularity.
While the above example provides a general understanding of how PageRank works and why link building is important, it's very basic. It omits key factors such as:
- The trust and authority of the linking pages.
- The SEO and content optimization of the respective sites.
- The anchor text of the incoming links.
For a more in-depth explanation of how PageRank is calculated, read through these resources:
- The original Google PageRank paper
- The Future of PageRank: 13 Experts on the Evolving PageRank Algorithm
- An in-depth discussion of the formula behind PageRank
- The Wikipedia page on the subject
The most important concept to understand is that, as Google says, you're more likely to have your content rank higher for keywords you're targeting if you can get external websites to link to you.
Link Building Strategies: How To Get Other Sites Linking to You
There are a number of link building strategies used to get external websites to link to yours:
- Content Creation & Promotion- Create compelling content that people will want to reference and link to, and tell people about it.
- Submissions- Submit your news to press releases, submit your site to directories, etc.
- Reviews & Mentions- Put your product, service, or site in front of influential bloggers.
- Links from Friends & Partners- Get people you know and people you work with to link to your site.
There are several other resources that will offer you more extensive and granular lists of ways to get links, but there are two main issues with all of these tips and tricks for developing inbound links to your website:
- They take a lot of time- Building out quality content and developing links from parties who may be interested takes a long time. It also requires certain resources: good copywriters, man hours to dedicate to the promotion of your goods or services, etc.
- They're dependent on external forces- You're reliant upon sites you don't control. This means you're unable to have a say in the quality of the linking page, the words they use to talk about your product, or the specific pages on your site they link to.
Best Ways to Build Links for Free
There's a better (free) link building strategy to build links to the pages you're attempting to improve search engine rankings for. In attempting to get a Web page to rank, there are a few key factors to consider:
- Anchor Text - One of the most important things search engines take into account in ranking a page is the actual text a linking page uses to talk about your content. So if someone links to our Good Guys Wind Turbine Parts site with the text "wind turbine parts", that will help us to rank highly for that keyword phrase, whereas if they had simply used text like "Good Guys LLC" to link to our site, we wouldn't enjoy the same ranking advantage for the phrase "wind turbine parts".
- Quality of the Linking Page- Another factor taken into account is the quality of the page that is sending the link; search engines allow links from high-quality, trusted pages to count more in boosting rankings than questionable pages and sites.
- Page the Link is Aimed At - Many times, when people talk about your site they'll link to the home page. This makes it difficult for individual pages to achieve high rankings (because it's so difficult for them to generate their own link equity).
These are all elements we can't control in attempting to get other sites to link to us. We can, however, control all of these elements in linking to our own pages from our own content.
- Determine what anchor text to use.
- Decide which page to point that anchor text at.
- Ensure that the quality and content of the linking page is high (since it's our page!).
Building external links to your site is important, but by focusing more of your efforts on the optimization of these internal links, you can build quality inbound links with rich anchor text to the proper pages, which can provide a meaningful ranking boost (for free!).
Internal Link Building Tools and Tips
So how do you go about building these great internal links? Well, you can set up a system for interlinking your pages in a few easy steps:
- Keyword Research for Link Building - First, you need to use a keyword research tool to get numerous keyword suggestions that are both relevant and popular.
- Assign Keywords to Content - Next, you have to group your keywords strategically, creating a search-friendly information architecture.
- Link Pages Using Targeted Anchor Text - The final step is to apply your keyword research to intelligent inter-linking; you do this by linking to content using the keywords you've discovered.
The execution of the third bullet is key. You need to be sure that you're linking to the right pages with the right anchor text. Here are a couple quick tips for carrying that out effectively:
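For example, a descriptive internal link beats a generic one; the page copy and URLs below are hypothetical:

```html
<!-- Good: the anchor text matches the target page's keyword -->
<p>We stock replacement <a href="/wind-turbine-rotors">wind turbine rotors</a> for all major models.</p>

<!-- Weaker: the anchor text tells search engines nothing about the target page -->
<p>For rotors, <a href="/wind-turbine-rotors">click here</a>.</p>
```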
Use Your Site Search
This one's pretty simple, and can be used for multiple purposes:
- Finding pages on your site to link to a new page- When you create new content, you want to make sure you can search your site for mentions of similar keyword variations you might want to link to that page.
- Finding a page that's been created to link to- Your site may have multiple content authors, so you may have a vague idea that a page about "wind turbine rotors" has been created without knowing its title or URL. You can either type the keyword into your site search to find the corresponding page, or use Google itself by searching: site:goodguyswindturbineparts.com intitle:"wind turbine rotors". This returns all of the pages Google has indexed with that phrase in their titles.
Create an Internal SEO Link Building Wire Frame
To do this, you simply need to map the keywords you'd like to target to the most logical pages. So, let's say we have three pages to choose from: a wind turbine parts page, a wind turbine rotors page and a wind turbine shaft page.
Since the turbine rotors page definitely seems to be the best fit for our "wind turbine rotors" keyword, we'll align that keyword with that page.
We can similarly match "wind turbine parts" and "wind turbine shaft" with the corresponding pages, recording each page alongside its target keywords in a spreadsheet.
Each page is associated with multiple keywords. By making this document available to all of your content writers, they can quickly see which pages are targeting which keywords; they can also instantly check your SEO wireframe to see which keywords have been targeted (with a simple Ctrl-F!).
Link Building Strategies: Why is Link Building Important?
Now that we’ve reviewed what the heck link building is all about, we next move on to its importance. Why do we need to do linkbuilding? What are its immediate and direct effects on your website search engine results page rankings?
The Link Rule
We’re all trying to make our websites friendly to search engines to rank higher. It’s a no-brainer to get into the search game since it’s still the number one activity done online. So since we’re trying to rank higher, we have to play by the search engine’s rules.
And the rule is: More quality backlinks, rank higher.
It’s simple, really – the more links you have coming into your site just means that more people are referring you for something you’re good at. That ‘something you’re good at’ is what they put in as the anchor text – but that’s a whole new different topic altogether.
Links aren’t easy to get – just as people’s referrals aren’t easy to get. Unless, maybe, you pay for them – which I personally don’t recommend. That’s why Google uses links as a ranking factor – because people will only link to you if they find value in what you have to offer them.
Your number of links is the only way Google can see how valuable your site is to other people. And the truth of the matter is, even if you have lots and lots of valuable content, if people don’t link to you, the search engines won’t acknowledge your site as valuable. Why?
Because search engines are machines. They’re blind about how valuable your site is. They can only see through links.
To sum it all up: More links -> More value -> Rank higher -> Happy Webmaster
Updates: Over the years, Google has become stricter about the link building techniques implemented by SEO specialists – or even those who claimed to be specialists. Now, it’s not the quantity that matters but the quality of each link – will readers be interested in where you direct them? Will they learn something? Is it useful? Is it really necessary to put that link there?
Don’t get me wrong here. I’m not saying that quantity doesn’t mean anything at all – of course we want links everywhere, but make sure that those are of value, and were placed on sites that are credible enough, or at least soon-to-be credible.
We have an article that discusses why fewer links can be better, in case you want further explanation.
This update doesn’t mean you don’t have to read the rest of our lessons about link building. The truth is link building is still a big factor for SEO success.
Link Building Resources That’ll Increase Your Search Rankings
VIII. Promoting Your Site to Increase Traffic
The main purpose of SEO is to make your site visible to search engines, leading to higher rankings in search results pages, which in turn brings more traffic to your site. And having more visitors (and above all buyers) is ultimately the goal of site promotion. To be fair, SEO is only one way to promote your site and increase traffic – there are many other online and offline ways to accomplish the goal of getting high traffic and reaching your target audience. We are not going to explore them in this tutorial; just keep in mind that search engines are not the only way to get visitors to your site, although they seem to be a preferable choice and a relatively easy way to do it.
1. Submitting Your Site to Search Directories, Forums and Special Sites
After you have finished optimizing your new site, it's time to submit it to search engines. Generally, with search engines you don't have to do anything special to get your site included in their indices – they will come and find you. It cannot be said exactly when they will visit your site for the first time, or at what intervals they will revisit it later, and there is hardly anything you can do to invite them. Sure, you can go to their Submit a Site pages and submit the URL of your new site, but don't expect them to hop to you right away. What is more, even if you submit your URL, most search engines reserve the right to judge whether to crawl your site or not. Anyway, here are the URLs for the submission pages of the three major search engines: Google, MSN, and Yahoo.
In addition to search engines, you may also want to have your site included in search directories. Although search directories also list sites relevant to a given topic, they differ from search engines in several respects. First, search directories are usually maintained by humans, and submitted sites are reviewed for relevancy. Second, search directories do not use crawlers to gather URLs, so you need to go to them and submit your site; but once you do, you can stay there indefinitely and no further effort on your side is necessary. Some of the most popular search directories are DMOZ and Yahoo! (the directory, not the search engine itself); their submission pages are at DMOZ, Yahoo! and Jasmine Directory.
Sometimes posting a link to your site in the right forums or on special sites can work miracles in terms of traffic. You need to find the forums and sites that are leaders in your fields of interest, but generally even a simple search in Google or the other major search engines will retrieve their names. For instance, if you are a hardware freak, type “hardware forums” in the search box and in a second you will have a list of sites that are favorites of other hardware freaks. Then you need to check the sites one by one, because some of them might not allow posting links to commercial sites. Posting in forums is more time-consuming than submitting to search engines, but it can also be pretty rewarding.
2. Specialized Search Engines
Google, Yahoo!, and MSN are not the only search engines on Earth, nor even the only general-purpose ones. There are many other general-purpose and specialized search engines, and some of them can be really helpful for reaching your target audience. You can't imagine how many niches specialized search engines exist for – from law, to radio stations, to education! Some of them are actually huge sites that gather Web-wide resources on a particular topic, but almost all of them have sections for submitting links to external sites of interest. So, after you find the specialized search engines in your niche, go to their sites and submit your URL – this could prove more traffic-worthy than striving to get to the top of Google.
3. Paid Ads and Submissions
We have already mentioned some alternatives to search engines – forums, specialized sites and search engines, search directories – but if you need to make sure that your site will be noticed, you can always resort to paid ads and submissions. Yes, paid listings are a fast and guaranteed way to appear in search results, and most of the major search engines accept payment to put your URL in the Paid Links section for keywords of interest to you. But you must also keep in mind that users generally do not trust paid links as much as they trust normal ones – in a sense it looks as if you are bribing the search engine to place you where you can't get on your own, so think twice about the pros and cons of paying to get listed.
- Chapter 7: Personalization & Search Engine Rankings
Google Now Personalizes Everyone’s Search Results
Personalization & Search Engine Rankings
Years ago, everyone saw exactly the same search results. Today, no one sees exactly the same search results, not on Google, not on Bing.
Everyone gets a personalized experience to some degree, even in private browsing windows.
Of course, there’s still a lot of commonality. It’s not that everyone sees completely different results. Instead, everyone sees many of the same “generic” listings. But there will also be some listings appearing because of where someone is, whom they know or how they surf the web.
One of the easiest personalization ranking factors to understand is that people are shown results relevant to the country they’re in.
Someone in the US searching for “football” will get results about American football; someone in the UK will get results about the type of football that Americans would call soccer.
If your site isn’t deemed relevant to a particular country, then you have less chance of showing up when country personalization happens. If you feel you should be relevant, then you’ll probably have to work on your international SEO.
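One concrete piece of international SEO is the rel="alternate" hreflang annotation, which tells search engines which URL serves which country or language audience. A minimal sketch of generating those tags (the domain and URL layout below are made-up examples):

```python
def hreflang_tags(variants):
    """Render rel="alternate" hreflang <link> tags for a dict of locale -> URL.

    Locales use hreflang codes like "en-us" or "en-gb"; the special value
    "x-default" marks the fallback page for unmatched audiences.
    """
    return "\n".join(
        f'<link rel="alternate" hreflang="{locale}" href="{url}" />'
        for locale, url in variants.items()
    )

# Hypothetical site with US, UK, and default variants:
print(hreflang_tags({
    "en-us": "https://example.com/us/",
    "en-gb": "https://example.com/uk/",
    "x-default": "https://example.com/",
}))
```

These tags go in each page's <head> (every variant should list all the others, including itself), which helps the engine show the UK page to UK searchers and the US page to US searchers.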
Search engines don’t stop personalizing at the country level. They’ll tailor results to match the city or metropolitan area based on the user’s location.
As with country personalization, if you want to appear when someone gets city-specific results, you need to ensure your site is relevant to that city.
Ph: Personal History
What has someone been searching for and clicking on from their search results? What sites do they regularly visit? Have they “Liked” a site using Facebook, shared it via Twitter or perhaps +1’d it?
This type of personal history is used to varying degrees, and in different ways, by both Google and Bing to influence search results. Unlike country or city personalization, there’s no easy way to make yourself more relevant.
Instead, it places more importance on first impressions and brand loyalty. When a user clicks on a “regular” search result, you want to ensure you’re presenting a great experience so they’ll come again. Over time, they may seek out your brand in search results, clicking on it despite it being below other listings.
This behavior reinforces your site as one that should be shown more frequently to that user. Even better if they initiate a social gesture, such as a Like, +1 or Tweet, that indicates a greater affinity for your site or brand.
History is even more important in new search interfaces such as Google Now, which will proactively present “cards” to users based on explicit preferences (i.e., which sports teams or stocks you track) and search history.
Ps: Social Connections
What do someone’s friends think about a website? This is one of the newer ranking factors to impact search results. Someone’s social connections can influence what they see on Google and Bing.
Those connections are what truly matter, because search engines view them as a user’s personal set of advisors. Offline, you might trust your friends and ask them for advice on a restaurant or gardening.
Increasingly, search engines are trying to emulate that offline scenario. So if a user is connected to a friend, and that friend has reviewed a restaurant or shared an article on growing tomatoes, then that restaurant and that article may rank higher for that user.
If someone can follow you, or easily share your content, that helps get your site into their circle of trust and increases the odds that others they know will find you. Nowhere is this more transformative than Google+, where circling a site’s Google+ Page will change the personalized search results for that user.
What do you plan on adding to your SEO to improve your personalization?
Google first introduced personalized search results in 2005 for signed in users with Google accounts. In 2009, personalized search was expanded to all users. However, new research on consumer sentiment on Google shows that 43.5% of respondents do not realize that their search results are personalized.
In this blog, I’ll cover the key factors you need to succeed with personalized SEO results as well as the best way to optimize.
In reality, it has been almost ten years since all search results were the same for everyone. There are a number of factors that influence a user’s search engine results page (SERP).
- Country: Location is a major factor. Users from different countries will see different results for the same search terms. One example often given here is that if a user searches for “football” in the U.K., they will be shown results that relate to what the United States refers to as soccer and the English Premier League. If a person in the United States searches for “football”, they will be presented with results that relate to American Football and the NFL.
- Locality: Google goes much deeper than simply country level. Results are tailored to the user’s location right down to local city level. Most people will be familiar with searches like “best pizza near me” making it somewhat surprising that our recent research showed nearly half of respondents were unaware that Google personalizes their search results.
- Web History: The goal of Google’s personalized search results is to provide users with the most relevant and useful information possible. By taking previous searches and viewing history into account, Google can present users with results from the sites they prefer and are most likely to visit.
- Device: People who use Google to search on their mobile device will see different results than the same search on desktop. Google uses a different algorithm for mobile ranking with increased focus on user location.
One way personalization has changed SEO is that it makes keyword rank tracking more difficult. Personalized search results mean that tracking where your site ranks for keyword search terms is not always crystal clear. Depending on the personalizing factors we have outlined above, users will see different results for the same search terms. That means you can never really get a 100% accurate representation of your keyword ranking.
Rank tracking your non-personalized search can help you establish a baseline, and this data can help you see how changes on your site have impacted your ranking. However, it is not advisable to spend too much time obsessing over keyword rankings. Your site traffic is a much more important data point than your keyword rankings.
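Since each personalization factor shifts the SERP a different searcher sees, one rough way to build the baseline mentioned above is to sample the keyword's position several times (e.g. from depersonalized or varied sessions) and average them. A minimal sketch, where the "position 100 for not found" convention is an assumption borrowed from how many rank trackers handle missing results:

```python
def average_rank(sampled_positions):
    """Average a keyword's position across several SERP samples.

    None means the site did not appear in that sample; we treat it as
    position 100 (an assumed not-found convention, not a Google rule).
    """
    NOT_FOUND = 100
    positions = [p if p is not None else NOT_FOUND for p in sampled_positions]
    return sum(positions) / len(positions)

# Three hypothetical samples: positions 3 and 5, plus one where the site was absent.
print(average_rank([3, 5, None]))  # -> 36.0
```

The point of the exercise is the trend, not the exact number: if the average drifts down after a site change, that is a more useful signal than any single personalized snapshot.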
How to Optimize for Personalized Search Results
The fact that 43.5% of respondents to our survey didn’t realize their search results were personalized leads us to believe that many businesses are not optimizing their sites for personalized searches. There are some steps companies can take to give them the best chance of ranking today.
- Think Local: Make Google’s location settings work in your favor. Take care as you compose your meta descriptions and title tags. If you get onto page one of Google, you have one chance to impress so make it count by using your allotted characters wisely. If you want to dominate the local market, it is a best practice to put the name of your locality into your meta descriptions and/or title tags. You should also make sure that the most up to date and accurate business details are added to all online directories like Yelp and, most importantly, Google My Business (GMB). Claiming your GMB page is easy, but the business owner must claim it. Simply log on and verify your business. Remember to make sure your details are added to all online directories and not just the major players like GMB and Yelp. Bing Places for Business and others are low hanging fruit that can really help your site’s performance in local search rankings.
- Focus on Long-Tail Keywords: It is estimated that up to 80% of searches are long-tail keywords. The reason that SMBs should focus on long-tail keywords is that competition is lower so they are easier to rank for and user intent is much more targeted. By working out the long-tail keywords that can drive conversions and revenue, you can start to see real ROI from your SEO efforts.
- Mobile Optimize: It goes without saying that you need to optimize your site for mobile visitors in 2017 and learn to thrive in Google’s mobile search index. Having a low click-through rate can harm your ranking. If you manage to get onto page one, you don’t want to undo all your good work with a site that does not meet user expectations on mobile.
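The "Think Local" advice above – locality in the title tag, allotted characters used wisely – can be sketched as a small helper. The 60-character limit is an approximation of what Google typically displays in a title, and the business names are made-up examples:

```python
def local_title(business, locality, limit=60):
    """Compose a '<Business> | <Locality>' title tag for local SEO.

    Raises if the result exceeds ~60 characters, the rough length Google
    displays before truncating (the exact cutoff varies, so this is a
    conservative assumption rather than a hard rule).
    """
    title = f"{business} | {locality}"
    if len(title) > limit:
        raise ValueError(f"Title is {len(title)} chars; aim for <= {limit}")
    return title

print(local_title("Tony's Wood-Fired Pizza", "Brooklyn, NY"))
# -> Tony's Wood-Fired Pizza | Brooklyn, NY
```

The same pattern applies to meta descriptions, just with a longer budget (roughly 150-160 characters).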
How have you used personalization to appeal to your target audience? What do you plan on adding to your SEO to improve your personalization? Tell me about it in the comments!
Beginning today, Google will now personalize the search results of anyone who uses its search engine, regardless of whether they’ve opted-in to a previously existing personalization feature. Searchers will have the ability to opt-out completely, and there are various protections designed to safeguard privacy. However, being opt-out rather than opt-in will likely raise some concerns. The company has an announcement here. Below, a deeper look.
How Search Personalization Works
For those unfamiliar with how personalized search works, see my Google Search History Expands, Becomes Web History. It goes into great detail about how Google personalizes results.
The short story is this. By watching what you click on in search results, Google can learn that you favor particular sites. For example, if you often search and click on links from Amazon that appear in Google’s results, over time, Google learns that you really like Amazon. In reaction, it gives Amazon a ranking boost. That means you start seeing more Amazon listings, perhaps for searches where Amazon wasn’t showing up before.
The results are custom-tailored for each individual. For example, let’s say someone else prefers Barnes & Noble. Over time, Google learns that person likes Barnes & Noble, and they begin to see more Barnes & Noble listings rather than Amazon ones.
Of course, people will be clicking on a variety of sites in search results. So it’s not a case of having one favorite that simply shows up for everything. Indeed, Google’s other ranking factors are still considered. So that person who likes Amazon? If they’re looking for a plumber, Amazon probably isn’t close to being relevant, so the personalization boost doesn’t help. But in cases where Amazon might have been on the edge? Personalization may tip it into the first page of results. And personalization may tip a wide variety of sites into the top results, for a wide variety of queries.
To personalize results, Google has to record what you’re doing — and that rings privacy alarm bells. Can people see what you’ve looked for? How long is the material kept? Can you just turn it off?
You can turn it off. A history is kept for 180 days. You can delete that history at any time, but even if you don’t, it can’t actually be viewed.
In particular, we now have two “flavors” of personalized search, or “Web History,” as Google officially calls it. There’s Signed-Out Web History and Signed-In Web History.
In Signed-Out Web History, Google knows that it has seen someone using a particular browser before. Behind the scenes, it has tracked all the searches that have been done by that browser. It also logs all the things people have clicked on from Google’s search results, when using that browser. There’s no way to see this information, but it is used to customize the results that are shown. It only remembers things for 180 days. Information older than that is forgotten. Google doesn’t know your name. If you use a different browser, Google doesn’t know your past history. In fact, you can’t even see your past history.
In Signed-In Web History, Google knows that a particular Google user is using Google. Behind the scenes, it has kept a record of all the things that person has done when signed-in, regardless of what computer or browser they’ve used. If they’re using the Google Toolbar with the page tracking feature enabled, then it has also kept a record of all the pages they’ve viewed over time. This information can be viewed by the user at any time, and the user can selectively delete info. They can also delete everything, if they want. If they don’t, then Google forgets nothing.
Let’s do a chart:
| | Signed-Out Web History | Signed-In Web History |
|---|---|---|
| What’s recorded | What you click on in search results | What you click on in search results & pages you visit, if Google Toolbar tracking feature is specifically enabled |
| How long is data kept? | 180 days | Forever, or until user deletes it |
| Can you view search history? | No | Yes |
| Can you opt-out permanently? | Yes | Yes |
Can’t View History
An important aspect of the change is understanding that there’s no way for you — or anyone — to see what you’ve searched on or clicked on in the past, if you’re using the signed-out version of web history.
Google Now Notifies Of “Search Customization” & Gives Searchers Control goes into much more depth about how last year, Google began notifying searchers if it changed their results based on their previous query. Clicking on the notification would show the previous query, which might be embarrassing or worse if you left your computer and someone else saw it. To limit exposure, only the last 30 minutes of previous query information was shown.
With the change, Google’s storing much more than the last 30 minutes of previous history. However, that’s not being shown.
Let’s do some pictures. Here, I’ve done a search for spain:
Notice the arrow pointing to Web History. This is effectively a default notification that results are being logged for personalization. Clicking on it leads to a notification page that in turn allows for opting-out.
Now here’s another search I did right after that, for travel:
Notice I’m pointing at the “View customizations” link that has now appeared. This is another notification, an explicit one where Google’s saying effectively “Hey! You searched for ‘travel,’ but I’ve altered the results I’ve shown you based on things I know about you personally.”
So what’s Google know? In this case, if you click on the link, you get shown:
I’ve highlighted the key part. Google’s saying that it used your search history to alter this. Almost certainly, this means it saw I had just searched for “spain,” and so added that word to the query “travel.” In the past, it would have told me this specifically. But now that data is being kept longer, it’s not showing any previous query or past search history material.
This Freaks Me Out!
Don’t like the idea of your searches being recorded, even if you’re not logged in? Keep in mind a few things for perspective:
- All the major search engines have long recorded what you search on. Google’s simply using that information to refine your results, in addition to what all of them already do with it: show ads.
- Your browser itself records what you search on — and often, people fail to clear their browser histories.
- You don’t have to use it.
Remember I mentioned that opt-out page? Let’s see what it says:
See the link I’ve pointed at? Click on that, and you’ve turned off logging for personalization purposes. Google will no longer keep track of what you’ve searched on in the past, in association with your browser, in order to perform personalization. In addition, Google remembers that you don’t want to be logged in the future. For the technically inclined, this is nice. It means you can have a Google cookie that knows you don’t want to be logged, rather than having to access Google without a cookie at all.
Note that even if you opt out, Google will still log what you search on, as it always has done. It just won’t use that information to personalize. And after 180 days, even this logged-but-not-used information is deleted automatically (see Anonymizing Google’s Server Log Data — How’s It Going? for more about this).
Change your mind? Click the Web History link I mentioned earlier. Oddly, it will still show even if you’ve opted out and nothing is being logged (plus, “Web History” is a bad name, since for signed-out users it isn’t really tracking what you do across the web). Click Web History, and you can re-enable customized search.
What About Diversity?
Interestingly, I’ve spoken on the subject of Google’s preexisting search personalization feature three times over the past week, and each time, a key question has arisen. If Google rewards the sites you like, does that mean eventually you’ll only see stuff you like? Would a conservative see only conservative web sites? A liberal see only liberal web sites?
No, Google says. Annoyingly, the company will not give any metrics about what percentage of results a typical searcher gets back that are personalized in some way, nor what percentage of the results themselves are changed. I.e., are 85% of queries personalized? And if you get a page of personalized results, are 20% of the links on that page personalized? I couldn’t get any such figures.
However, Google did say it wants to keep some results similar between users:
“We want diversity of results,” said product manager Johanna Wright. “This is something we talk about a lot internally and believe in. We want there to be variety of sources and opinions in the Google results. We want them in personalized search to be skewed to the user, but we don’t want that to mean the rest of the web is unavailable to them.”
- Chapter 8: Social Media & Ranking In Search Results
While most people know that social media is a big deal in content marketing, not everyone understands how social media affects organic search results. And understandably so: the topic isn't exactly cut-and-dried. The fact of the matter is that, while social media can influence search rankings, the process likely doesn't work exactly the way you think it does.
Contrary to what many people believe, Google doesn't actually take social media directly into account in its ranking algorithm, but it does look at certain social signals to decide how to rank websites. Here's what you need to know.
Google & Social Media: How Do They Play Together?
Google and social media platforms like Facebook and Instagram have had a long and complex relationship. Like Brad and Angelina, they're on-again and off-again. While social signals have long been used in Google algorithms, it's not so much actual social media accounts that affect the algorithm as it is the results of those accounts. Here's what I mean:
There are millions of social media profiles in the world, and if just having one was enough to rank well in Google, everyone would be on the front page of Google's SERPs – which would be great, except that anywhere between 5% and 11.2% of all Facebook profiles are fake. This is one of the major reasons that simply having a social media profile isn't enough to rank well in Google's SERPs.
By the same token, however, there are millions of authentic social media profiles around the globe, which means Google can't afford to overlook them completely.
Because of that, Google evaluates social signals rather than social profiles when ranking pages.
Here are a few of the things the search engine looks at:
- How many shares content created by a given brand/individual earns
- How useful, informative, and valuable the content is
- How relevant the content is to a specific base of readers
The reason that Google looks at these things rather than the simple existence of a social profile is because Google's entire mission in search is to provide users with relevant, high-quality search results.
While the existence of a social media profile for a brand or individual isn't necessarily indicative of quality or authority, the presence of plenty of highly shared, liked, and talked-about content is.
Google+ & SEO
While there are dozens of social media platforms, such as Facebook and Twitter, that aren't directly affiliated with Google, there's one that is: Google+.
Google+ is a social network designed by the search engine giant itself and it may count a bit more for SEO than some of the aforementioned platforms. Because Google+ pages rank within Google just like website pages, the social network is an effective way for marketers and individuals to rank for keywords and phrases contained within a post.
Google recently redesigned how G+ pages appear, too; here's a screenshot taken in March 2016:
The reason Google+ can be so powerful for SEO is that articles shared on the platform are subject to DoFollow links. These links boost the link equity of the website in question and facilitate the core functions of SEO. Because of this, Google+ provides slightly more SEO power than some other social media platforms.
5 Tips to Use Social Media to Improve SEO
While social media isn't the end-all-be-all of SEO, it can certainly be a helpful outlet for companies that need a boost. Here are five tips to help you get started:
1. Boost your followers
While having thousands of Facebook friends may well earn you some extra SEO juice, there are some caveats to this. First of all, Google's smart enough to distinguish between legitimate followers (interested audiences trying to keep up with a brand or individual) and spam followers (purchased accounts traceable to "click farms" in India). With that in mind, purchasing hundreds of followers or having dozens of spammy, fake social media accounts following you won't ultimately help your SEO standing and may, in fact, hurt it.
Social media platforms have been cracking down on fake profiles of late. A great example was 2014's so-called "Instagram Rapture," wherein celebrities like Justin Bieber lost millions of followers in a 24-hour period when the platform purged its fake accounts. There's an important lesson to be learned here: while having tons of followers can boost your SEO, it's important to focus on quality first and quantity second.
It's also important to remember that quantity isn't something that happens overnight. Growing your follower base is a slow process but companies and individuals who put out quality content, interact with followers, and offer helpful tips will eventually get there. As your followers begin to grow, your content will enjoy a wider reach and Google will begin to realize that the popularity of your social media profiles is a good indicator of the quality of your brand as a whole.
2. Earn inbound links
While many people think about social media and search engines as separate entities, few people remember that social media is a search engine of sorts. The majority of consumers use social media to search for their favorite brands and individuals and, when they find content they love, they share it with their friends on other web-based and social media platforms.
Brands that create content that can be located through the search bars of social media platforms are setting themselves up well for plenty of external links to that content, which is one of the many ranking factors Google looks for when evaluating websites. While external inbound links are important, marketers need to remember that, once more, it's the content that counts here more than the vehicle (the social media account). By using things like hashtags to popularize content and interacting and engaging with comments and feedback the content receives, marketers can ensure that their social media content earns all of the inbound links it deserves.
3. Create "anchor" content for social media
Unless you're great at witty status updates and Tweets, it's unlikely that your profile updates are going to go viral. Companies that create "anchor" content, however, have a much better chance of being shared around the web. Anchor content is weighty, relevant content formats like infographics, videos, long-form articles, or podcasts. This content serves as the foundation for an extensive message that is valuable to many people.
By creating anchor content and distributing it on your social platforms of choice, you can drive traffic back to other sites and improve your social search signals enough to facilitate a boost in SEO.
4. Encourage sharing
Social media is meant for sharing and one of the best ways to improve your ranking is to facilitate social sharing as much as possible. Keep in mind, however, that it's not enough to simply put content out there and hope that people share it. Instead, you need to ask for shares directly. Some brands do this by offering a reward or incentive for readers who share a post and others set up social surveys in which people "like" a post if they agree with an issue at hand. As people engage with incentives like this, the content pops up on more and more news feeds and the brand behind the strategy earns more followers. This, in turn, influences social signals and helps you get the SEO boost you deserve.
5. Optimize your posts for local SEO
Social media is a powerful tool for any small, local business interested in earning more customers. When you optimize your social media profiles, posts, and pages for local SEO, you can better establish yourself in a local community while also positioning yourself correctly for enhanced local search results. Do this by linking all of your in-house marketing and promotional material to your social media profiles, inviting customers to leave a review or a comment there, and running promotions on your social media pages.
You should also ensure that the content and information you post on your social pages includes information about your company's niche and the geographical area in which you work. This will help you enhance your local search prowess and become more visible in local SERPs.
The Verdict: Social Media Does Affect SEO
While social media certainly isn't the only thing that Google looks at when deciding how to rank your site, it's one of the more important. While spammy, black-hat social media tactics like buying followers will get you nowhere, companies and individuals who build expansive, varied, relevant social media presences will enjoy an SEO boost as a result.
As is true in most of content marketing, content is the most important aspect of social media and people who create great content and succeed in getting it passed around the web will enjoy all of the SEO benefits that social media has to offer.
The Role of Social Media in Rankings
Social media is too powerful a tool for brands to ignore, and unless there’s some seismic shift in the attitudes and habits of users, its value will only continue to grow over the coming years. That being said, it’s unclear exactly how search engines use social media to rank your website.
While Google has flip-flopped on their algorithm’s use of social signals as a factor in search rankings, the official word for now is that social signals (likes, retweets, pins, shares, follows, etc.) don’t factor into search results.
That being said, social is still a very powerful SEO tool, even if social signals don’t directly impact your search rankings.
Activate Your Brand Community to Reinforce Relevance
Though social may only indirectly affect your rankings, one of the ways it does is through the power of your community. Search engines factor in the size of your following on social media as a measure of your authenticity. After all, if you only created phony clickbait content, you wouldn’t have been able to build a large community of followers in the first place.
Search engines take into account the size of your social media following.
Do fake users count? In short, no, because fake users can’t engage with your content. Real users who click through from your social media accounts to your website or blog and share your content will ultimately boost critical website stats. As Google tries to determine authenticity and content value, it can look at that traffic as concrete proof that you’re doing something right.
These three quick tips will help you craft social media posts that boost clicks to your website.
- Cut Down on Word Count
While the “ideal” length of copy varies from platform to platform, Neil Patel at Quick Sprout found that generally speaking, less is more when it comes to your social media copy. In fact, his research shows the click-through rate on paragraph-long Facebook copy is 2-6 times lower than updates that are approximately 40 characters long.
Try to keep your social media posts to around 40 characters.
Rather than using extended copy in your social media posts, opt for a lower word count. Use all of the words you need, and not one more.
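The ~40-character guideline above can even be enforced mechanically when drafting posts. A minimal sketch that trims copy at a word boundary (the 40-character target follows the click-through research cited above; the sample post is made up):

```python
def shorten_copy(text, limit=40):
    """Trim social copy to roughly `limit` characters, cutting at a word
    boundary and appending an ellipsis so the post reads cleanly."""
    if len(text) <= limit:
        return text
    # Find the last space inside the limit so we never split a word.
    cut = text.rfind(" ", 0, limit)
    if cut == -1:
        cut = limit
    return text[:cut].rstrip() + "…"

print(shorten_copy("Our new guide walks you through every step of local SEO"))
# -> Our new guide walks you through every…
```

In practice you would pair the trimmed teaser with a link, letting the full headline live on the landing page.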
- Tell People What You Want Them to Do
Sometimes, all your audience needs is a simple request so they know how to better interact with your content.
A quick search for the most clickable words on social media proves that most lists have one thing in common: they’re full of call-to-action phrases. Use words and phrases like “check out,” “see,” “retweet,” “like,” and “follow” to get people to engage with your social media content.
Add a call to action to tell your audience what you want them to do.
Don’t leave clicks, conversions, or shares to chance. Use a clear, concise directive when you want viewers to do something.
- Give People a Reason to Click Through
It’s Youtility principle number one: If you want to connect with your audience, don’t sell to people; help them.
Make sure the content you share on social media answers a problem or pain point that’s common to your customers and prospects. Use the copy in your social media posts to refer to both the problem and the solution, and people will happily click through to your website!
Help your readers solve a problem.
Treat Social Media Like a Search Engine to Increase Discoverability
What about your social content itself? Don’t your social posts have some value as backlinks? Unfortunately, links you share on social media don’t add directly to your website’s search rankings. Once the posts are published, the links carry a “nofollow” attribute (code that tells search engines not to pass ranking credit through the link).
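For the curious, here is a minimal sketch, using Python’s standard-library `html.parser`, of how a crawler might separate nofollow links from ones that can pass ranking credit. The URLs and markup are invented for illustration:

```python
from html.parser import HTMLParser

class NofollowAudit(HTMLParser):
    """Collect links, separating those a search engine is told to ignore
    (rel="nofollow") from those that can pass ranking credit."""
    def __init__(self):
        super().__init__()
        self.followed, self.nofollowed = [], []

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        attrs = dict(attrs)
        href = attrs.get("href")
        if not href:
            return
        # rel can hold several space-separated tokens, e.g. "ugc nofollow"
        rel = (attrs.get("rel") or "").lower().split()
        (self.nofollowed if "nofollow" in rel else self.followed).append(href)

# This is roughly what a link shared on a social network looks like in markup:
html = '''
<a href="https://example.com/post" rel="nofollow">shared link</a>
<a href="https://example.com/about">editorial link</a>
'''
audit = NofollowAudit()
audit.feed(html)
print(audit.nofollowed)  # ['https://example.com/post']
print(audit.followed)    # ['https://example.com/about']
```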
However, because social media platforms rank search results on keyword strength and popularity, your social presence should be crafted with SEO in mind. Here are two ways you can leverage social SEO for the same reason you use traditional SEO: to help people search for and discover you and your content.
- Optimize Profiles for Keyword Searches
First, identify your foundational SEO keywords and then include them in the headlines, summaries, links, and bios of each of your social media profiles. You can also use those keywords in the captions of relevant images you post.
People searching Facebook or another network might happen across your company because of the foundation-level keywords built into your page. From there, they can be drawn into the funnel of your inbound strategy.
Just take a look at the top two results when you search for “digital marketing” on Twitter:
Use relevant keywords in your social media profiles.
From the bios alone, it’s obvious that Jay Baer and his team have been very intentional about weaving the “digital marketing” keyword into these two profiles, and it has paid off in their Twitter search rankings.
- Optimize Your Copy With Keywords
From a single hashtagged post, a follower might find your profile, more of your similar content, and click through directly to your website. Use carefully selected keywords in your posts, links, and hashtags to help elevate your rank in social search results and add to the visibility of your brand.
Use Social Media Content to Amplify Your Online Footprint
The power of content lies in its ability to reach beyond established consumers and find new audiences and potential customers. Not only is content shared on blogs and other websites, it ranks on search engines as well. So naturally, your social media content can contribute heavily to your SEO.
- Use Keywords to Drive Content Creation
To boost your visibility on search engines across the board, make creating social media content designed to rank for specific keywords part of your marketing strategy.
For example, check out how these chefs have created YouTube content to show up in search engine results for a query related to their expertise.
Design content that will rank for specific keywords.
- Publish Social Posts Worthy of Backlinks
Social media posts can end up in all sorts of places beyond traditional social networks. They’re being used in blog posts and articles, ebooks, and SlideShare presentations.
If you create social media content other people want to reference, you’ll gain backlinks to your social media profiles and website, effectively increasing your authority with search engines.
This article from Greatist, “55 Must-Follow Twitter Accounts Guaranteed to Make Your Day,” is the perfect example.
Your social media posts could end up in places outside of social channels, such as blog posts and articles.
Time for You to Unlock the Relationship Between Social and SEO
It appears inevitable that social media will eventually tie directly into every website’s search rankings. For now, your social media should act as an extension of SEO: an inbound marketing tactic that directly impacts your reach, popularity, brand identity, and influence online.
Consider the rules of SEO when crafting your social content. If your primary goal is to help your audience find your business, you should be creating messages that make you stand out from the crowd. Keep your content smart, brief, and easy to enjoy at a glance. Most of all, make it irresistible!
Does Social Media Affect Your Rankings in Online Search?
A Bit of History
As I mentioned above, Cutts’ statement that Google does not look at social signals when determining the rank of a webpage came as a big surprise to the online marketing industry. After all, in a video published in December 2010, Cutts himself said that social signals were a factor in ranking.
In this video, Cutts refers viewers to Danny Sullivan’s Search Engine Watch article, for which Sullivan had spoken directly with Bing and Google to learn how the two search engines look at social signals as a ranking factor.
Both search engines told Sullivan that who you are as a person on Twitter can impact how well a page does in regular web search.
A variety of studies, including SearchMetrics’ Rank Correlation for 2013 and the case studies outlined in this infographic from Quicksprout, gave additional weight to the idea that search engines look to social signals when ranking a webpage. So you can understand why marketers were dismayed and a little annoyed when, three years later, Google told them nope, sorry guys, we actually don’t look at that stuff right now.
Despite all this back-and-forth, Neil Patel, SEO expert and founder of Quicksprout, recently urged marketers not to discount social’s impact on SEO too quickly; he thinks that social is the new SEO, and his argument is pretty convincing.
Why does Patel think that social is the new SEO, and how are other marketers integrating social into their SEO strategy? I dove into researching this subject and identified 5 key things every marketer should know about how social media impacts SEO in 2014 (going into 2015).
My research also left me with a few questions, which I mention throughout the post; I’d love to hear your thoughts in the comments below!
5 Things to Think About When Considering The Impact of Social on SEO
1. Social Links May or May Not Boost Your Search Rank
Okay, social signals pertaining to a profile’s authority are out, but does Google consider links published on social accounts to be credible backlinks? When a blog post goes viral on Twitter, do those new links boost the post’s search ranking?
Many marketers believe that links to your website via social media accounts do have a major impact on your rankings. Says Marketing Consultant Brian Honigman:
Today, links are mainly achieved through developing original content that is, in turn, shared across social media. Links to your content on Facebook, Twitter, LinkedIn, Google+, YouTube and other social networks help the search engines understand what websites are credible and should be ranked for what keyword phrases.
In Danny Sullivan’s 2010 interview with Google and Bing for Search Engine Watch, Google first says that it doesn’t incorporate the number of times a link has been tweeted into their search rank algorithm, and then it goes on to say that it does (doh). Bing says that it definitely looks at this data:
We take into consideration how often a link has been tweeted or retweeted, as well as the authority of the Twitter users that shared the link.
While Cutts’ 2014 video is crystal-clear about the absence of social signals from the search algorithm, he does say that Google crawls social websites for data in the same way that it would any other site:
Facebook and Twitter pages are treated like any other pages in our web index, and so if something occurs on Twitter or occurs on Facebook and we’re able to crawl it then we can return that in our search results.
This leads me to think that while the authority of a social account doesn’t impact search rank, links published on social media could be marked as credible back-links and thus influence a page’s rank.
Takeaways: When Cutts made his statement about Google not factoring in social signals I understood him to mean clues about a particular company’s authority on social media, which, for me, is distinct from the number of times a page has been linked to on social media. Further research didn’t help me get much clarity on this point.
If there are any SEO experts reading this, I’d love for you to chime in below in the comments.
2. Social Media Profiles Rank in Search Engines
While social shares may or may not affect a webpage’s position in search listings, your social profiles definitely influence the content of your search results. In fact, social media profiles are often amongst the top results in search listings for brand names. When I searched “General Electric” in Google, the company’s Instagram and Pinterest profiles appeared as the 5th and 6th listings, respectively, and Twitter was the 8th result.
Moreover, Google displayed the company’s Google+ profile information in the right-hand sidebar at the very top of the search results page.
Social channels can feel more personal than webpages, and they’re a great way to get a sense of a company’s personality off the bat. When I’m researching a company I don’t know much about I typically go straight to their Twitter or Facebook page. So if a social account shows up at the top of the search results, I’m just as likely to click on it as I would be to click on their website.
Takeaway: There’s no doubt that your social profiles matter to Google and especially to people who are looking for you online. A few active social channels can make the experience of getting to know your brand online more fun, engaging and personal. Also, while some may consider Google+ a non-essential social channel, marketers shouldn’t discount the fact that a company’s Google+ profile is one of the first things a searcher will see (and potentially click on). As such, it pays to have a profile with up-to-date info and engaging content.
3. Social Media Channels Are Search Engines, Too
Nowadays, people don’t just go to Google and Bing to look stuff up; they also use social media channels to find what they’re looking for. Patel makes this point in his article on why social is the new SEO: “We need to understand that search engine optimization includes the search that happens on social media search engines.”
This works in a couple of ways: First, if you’re active on Twitter, it’s entirely possible that people will discover your company’s new content distribution app after searching for content marketing-related tweets with Twitter’s search engine. Likewise, brands that lend themselves to beautiful visual content can benefit from making their content visible in Pinterest and Instagram by using hashtags and properly categorizing their pins.
Moreover, as mentioned in point #1, if someone wants to check out your company, they’re likely to open Twitter and Facebook and do a quick search to see what kind of presence you have on each channel. YouTube, and, of course, Google+ are also search engines.
Here are some impressive stats that illuminate just how much people are using social media to search:
- As of 2010, Twitter handled 19 billion search queries a month (that’s more than 5x the queries handled by Bing!).
- In 2012, Facebook said it got around one billion search queries per day.
- As of March 2010, YouTube got roughly 7 billion search queries a month. Also, 100 hours of video are uploaded to YouTube every minute, making it one of the largest content repositories on the web.
Takeaways: Companies should expand their concept of SEO to include not just the traditional search engines––Google and Bing––but also social search engines.
When searching for a brand on Facebook or Twitter it’s not uncommon to see several different profiles pop up, and it’s not always clear which one is the real deal. Marketers need to ensure that it’s super easy for users to identify their official social profiles.
This may mean deleting duplicate accounts and/or clearly labeling each social account so that users understand what purpose they serve (for example, accounts for HR or press versus general brand pages).
4. Not Now Doesn’t Mean Not Ever
Just because Google says that social signals don’t currently impact search rank doesn’t mean they never will. Social media shows no sign of becoming a less important part of a brand or person’s online presence anytime soon; moreover, given that link-building strategies like guest blogging have become a less reliable way to indicate the quality of a webpage, it makes sense that search engines would begin to look for other signals of authority and value.
Takeaways: There’s no reason why social signals won’t begin to affect search rankings in the future, so smart brands will continue to build their authority in key social channels and think about social when designing their SEO strategy.
5. Don’t Forget Bing
Google may have back-tracked and changed their stance on social signals, but I haven’t found any evidence that what Bing told Sullivan for his Search Engine Watch interview doesn’t hold true today.
Remember, Bing said:
We do look at the social authority of a user. We look at how many people you follow, how many follow you, and this can add a little weight to a listing in regular search results.
Takeaways: Bing, which is the second most-used search engine, has been crystal clear about how their algorithm incorporates social signals into their search results, and, unlike Google, they haven’t flip-flopped on the issue. With its market share steadily growing, companies would be wise to include Bing in their SEO strategies.
Cutts’ claim that Google’s search algorithm ignores social signals should not be seen as an invitation for marketers to dismiss social’s impact on SEO. Instead, marketers should broaden their concept of search and SEO to take into account the myriad ways that people find content on the web. They also need to think about the positive effects that increased traffic from social can potentially have on their search rankings as well as the prominence of social profiles on first-page search results.
Ultimately, the web is all about building relationships, fostering audiences, expressing identity and sharing ideas––it’s inherently social, and there’s no reason that SEO best practices would go against the grain, especially since the rules that govern SEO are ultimately meant to make the web a more enjoyable and useful place.
Okay, your turn: How else do you think social affects SEO?
- Chapter 9: Violations & Search Engine Spam Penalties
So far, we’ve discussed the positive signals that make up the Periodic Table Of SEO Success Factors. But there are also some negative factors to avoid.
A word of reassurance: Very few people who believe they’ve spammed a search engine have actually done so. It’s hard to accidentally spam, and search engines look at a variety of signals before deciding whether someone deserves a harsh penalty.
That said, let’s talk about things not to do!
Vt: ‘Thin’ or ‘shallow’ content
Responding to a drumbeat of complaints about poor search results, Google rolled out its “Panda” update in February 2011. Panda targets what is described as “thin” or “shallow” content or content that is lacking in substance.
This domain-level penalty targets sites with a predominant amount of so-so content and essentially treats it similarly to the way it treats overt spam techniques.
Today, it’s no longer a question of whether the content is simply relevant, but also whether it is valuable to the user.
To learn more about this, see some of our articles in the category below:
- Google: Panda Update
Vc: Cloaking
Let’s talk sophisticated hiding. How about rigging your site so that search engines are shown a completely different version from the one humans see?
That’s called cloaking. Search engines really don’t like it. It’s one of the worst things you could do. Heck, Google’s even banned itself for cloaking. Seriously.
While most people are unlikely to accidentally spam a search engine, the opposite is true when it comes to cloaking. That’s why there’s such a heavy penalty if you’re caught doing it. It’s a bait-and-switch, and it’s seen as a deliberate attempt to manipulate search results.
- SEO: Cloaking and Doorway Pages
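To make the bait-and-switch concrete, here is a hypothetical sketch of testing a URL for cloaking by fetching it under two different User-Agent strings and comparing the responses. The `fake_fetch` function and sample markup are invented for illustration; note that real pages also vary between requests for legitimate reasons (timestamps, rotating ads), so a naive byte-for-byte comparison produces false positives:

```python
import hashlib

def looks_cloaked(fetch, url: str) -> bool:
    """Fetch the same URL as a regular browser and as Googlebot and
    compare the responses. A mismatch suggests the server is cloaking.
    `fetch(url, user_agent)` is supplied by the caller (e.g. via urllib)."""
    browser = fetch(url, "Mozilla/5.0")
    crawler = fetch(url, "Googlebot/2.1 (+http://www.google.com/bot.html)")
    digest = lambda body: hashlib.sha256(body.encode()).hexdigest()
    return digest(browser) != digest(crawler)

# A toy "cloaking" server: it shows keyword-stuffed copy only to crawlers.
def fake_fetch(url, user_agent):
    if "Googlebot" in user_agent:
        return "<p>cheap flights cheap flights cheap flights</p>"
    return "<p>Welcome to our travel agency!</p>"

print(looks_cloaked(fake_fetch, "https://example.com/"))  # True
```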
Vs: Keyword stuffing
It’s one of the oldest spam tactics on the books, yet it’s still being used, and the search engines still don’t like it. Search engines say to use words you want to be found for on your pages. OK, I’ll give them those words over and over again! How about 100 times. In a row? That work for you, Google?
Actually, no, it doesn’t. That’s “keyword stuffing,” and it could get you penalized.
How often is too often? There’s no correct answer here, but you’d really have to go to extremes to cause this penalty to kick in. It’s most likely to happen to non-SEOs who just don’t know better and might decide to paste a word many times in a row, typically at the bottom of a web page.
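There is no official density threshold, but the extremes are easy to spot. A minimal sketch of the arithmetic; the function and sample strings are illustrative, not any metric Google publishes:

```python
import re

def keyword_density(text: str, keyword: str) -> float:
    """Fraction of the page's words taken up by a single keyword."""
    words = re.findall(r"[a-z']+", text.lower())
    if not words:
        return 0.0
    return words.count(keyword.lower()) / len(words)

stuffed = "buy shoes " * 50   # 'shoes' is half of all words on the page
normal = "We review running shoes and rate each pair on comfort."
print(keyword_density(stuffed, "shoes"))  # 0.5
print(keyword_density(normal, "shoes"))   # 0.1
```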
Vh: Hidden text
Once you decide to keyword stuff, your next thought will probably be “Why don’t I hide all this text that no human wants to see?” You might make the text white, so it blends with a page’s background. In doing so, you will have spammed a search engine.
Search engines don’t like anything hidden. They want to see everything that a user sees. Don’t hide text, whether by using styles, fonts, display:none, or any other means that keeps a typical user from seeing it.
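A minimal sketch of what an audit for the most common inline-style hiding tricks might look like. The patterns and sample markup are illustrative; a real check would also compute rendered colors to catch white-on-white text:

```python
import re

# Inline-style patterns that commonly hide text from human readers.
HIDING_PATTERNS = [
    r"display\s*:\s*none",
    r"visibility\s*:\s*hidden",
    r"font-size\s*:\s*0",
    r"text-indent\s*:\s*-\d{3,}px",   # text shoved far off-screen
]

def find_hidden_spans(html: str) -> list:
    """Return the style attributes in `html` that match a known
    text-hiding pattern."""
    hits = []
    for style in re.findall(r'style\s*=\s*"([^"]*)"', html, re.I):
        if any(re.search(p, style, re.I) for p in HIDING_PATTERNS):
            hits.append(style)
    return hits

page = '<div style="display:none">free pills free pills</div><p style="color:#333">Hello</p>'
print(find_hidden_spans(page))  # ['display:none']
```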
Vd: Piracy/DMCA takedowns
The “Pirate” update targeted sites infringing on copyright law. Under pressure from the Recording Industry Association of America (RIAA), Hollywood powerhouses and governments, Google began to penalize sites that received a large number of Digital Millennium Copyright Act (DMCA) “takedown” requests.
It’s unlikely that most sites will have to deal with these issues, but you should handle any DMCA takedown notifications that show up in your Google Search Console account.
Learn more about the Pirate update and piracy in the following categories:
- Google: Pirate Update
- Legal: Copyright
Va: Ads/Top Heavy layout
Have you ever been on a site and found it hard to find the actual content amid a slew of ads? Where’s the beef?
That’s what the Page Layout algorithm was meant to address. Often referred to as Top Heavy, this penalty is reserved for sites that frustrate the user experience by placing an overabundance of ads before content. So don’t make your users search for the content.
Learn more about the Page Layout algorithm from the following category:
- Google: Top Heavy Update
Intrusive interstitials are also an issue that Google has warned against and taken action over:
- Google’s App Interstitial Giant Ad Penalty Is Now Live
- Google confirms rolling out the mobile intrusive interstitials penalty
Vp: Paid links
Speaking of Google banning itself, it also banned Google Japan when that division was found to be buying links. For 11 months.
That’s longer than J.C. Penney was penalized (three months) in 2011. But J.C. Penney suffered another penalty after having its paid link purchase splashed across a giant New York Times article. So did several large online florists. And Overstock got hammered via a Wall Street Journal article.
The debate over whether Google should act so aggressively against those who buy and sell links has gone on for years. The bottom line is that to rank on Google, you have to follow Google’s rules — and the rules say no buying or selling links in a way that passes on search engine ranking credit.
If you choose to ignore Google’s rules, be prepared for little mercy if caught. And don’t believe programs that tell you their paid links are undetectable. They’re not, especially when so many of the cold-call ones are run by idiots.
As for Bing, officially, it doesn’t penalize for paid links, but it frowns on the practice.
The following category has posts with more information about paid links:
- Link Building: Paid Links
Vl: Link spam
Tempted to run around and drop links on forums and blogs, all with highly optimized anchor text (like “louis vuitton handbags 2013”), with the help of automated software?
If so, you’re spamming, and you’re not doing SEO. Sadly, all the people who hate the spam you leave behind get the impression that spam is what SEO is about. So SEOs hate you too – with a passion.
If you do go ahead with it, most of the links won’t give you the credit you were thinking they would. On top of that, you can find yourself on the sharp end of a penalty.
This penalty has been given more weight in this version of the table based on the efforts Google has made in neutralizing and penalizing link spam and, in particular, the launch of the “Penguin” update.
If you’ve been caught dabbling on the dark side, or if a fly-by-night “SEO” company got your site in hot water, you can disavow those links on both Google and Bing in hopes of redemption and a clean start.
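For reference, a disavow file is plain text: one URL or `domain:` entry per line, with `#` marking comments. A hypothetical example (all domains invented):

```text
# Disavow file example (hypothetical domains).
# Disavow a single spammy page:
http://spam.example.com/forum/profile.php?id=42
# Disavow every link from an entire domain:
domain:link-farm.example.net
domain:cheap-links.example.org
```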
More info & redemption
To learn more about spam, check out the categories below. If you’re seeking redemption, there’s also guidance from Google on how penalties are applied or removed and how to request reinclusion:
- SEO: Spamming
- Google: Penalties
- Google: Link Disavow
- Google: Penguin Update
- Bing reinclusion request form.
Google penalties and messages explained
Google reserves the right to apply manual spam actions, better known as penalties, to websites it finds in violation of its Webmaster Guidelines. The specific reasons and scope of manual penalties can be manifold, and the actual impact may range from barely noticeable to utterly disastrous for a website’s presence in organic Google Search results.
This regularly updated guide — written by a former senior Google Search Quality team member and SEO consultant — describes what types of penalties currently exist, demystifies Google’s messaging and explains how to go about successfully getting a Google manual penalty removed.
About manual penalties and this guide
Since 2012, Google has scaled up their efforts to communicate with webmasters via Google Search Console, previously known as Google Webmaster Tools, about website issues which are likely to negatively impact upon how visible a site ends up being in organic Google Search for relevant user queries.
This guide focuses on how you should interpret and respond to these notifications, which Google euphemistically calls “warnings,” many of which are related to Google Webmaster Guidelines violations — black-hat techniques spotted by the Google Search Quality team that were deemed egregious enough to trigger sanctions.
But notifications aren’t solely about the employment of black-hat techniques. We’ll also explore some warnings about issues that could be considered sins of omission, in that the site owner has failed to secure the site — allowed it to host spam or be hacked — or has failed to implement structured data markup correctly.
Google has also begun bringing webmasters’ attention to technical issues it identifies. While these may equally impact a site’s visibility in organic Google Search, they are not related to any Google Guideline violations and will be omitted here. That having been said, any piece of information highlighted in Google Search Console should be considered important and be taken seriously.
All sample messages discussed in this guide have been spotted “in the wild” within the last 24 months as of this writing. Older messages, not received in years, are not included in the manual penalty overview. All sample screen shots have been adapted to highlight the most relevant pieces of information that provide guidance on how to resolve the existing issue.
On-page guideline violations & related notifications
This set of violations and notifications apply to problems that have been identified on a site, which are directly under a site owner’s control.
- Major and pure spam problems
- Spam problems
- User-generated spam
- Hacked content spam
- Incorrect structured data
- Unnatural outbound links
Off-page guideline violations & related notifications
While a site owner may theoretically be unable to control what other places link to the site, black-hat practices like buying links or spamming other sites have led Google to be concerned with off-page issues as well.
- Unnatural inbound links
Reconsideration requests & related notifications
If you’ve received a manual penalty and have made a good faith effort to fix the issues that triggered it, you can request for Google to review your site so that you can have the penalty lifted. This is what is known as filing a reconsideration request.
Any time you receive a manual action notification, it should outline all the steps you should take to rectify the problem; these steps will vary depending on the specific penalty that has been issued. Once you’ve satisfied all of the requirements outlined by Google, the final step should contain a “Reconsideration Request” button that will initiate the process once clicked.
As part of the reconsideration request process, you may need to provide the Google team with documentation outlining the steps you have taken to bring the site into compliance with Google Webmaster Guidelines. This will help build a case for why the manual action should be lifted.
Once you’ve fixed all your website issues and submitted a reconsideration request, you may receive one of the following notifications in Google Search Console:
- Disavow file updated notification
- Reconsideration request (submission confirmed)
- Reconsideration request rejected
- Reconsideration request processed
- Reconsideration request approved
- Violations and Search Engine Ranking Penalties
In previous chapters we have covered the positive search engine ranking factors that search engines use to rank your page in their results. Now it is time to cover the search engine ranking penalties that could negatively affect your web site.
Don’t worry. It’s pretty hard to run afoul of the various search engines without actively trying to deceive them.
Now, let’s look at some of the bad SEO techniques.
“Thin” or “Shallow” Content
Having thin or shallow content is a new penalty. Until recently, Google couldn’t even decide whether having shallow content was as bad as actively trying to spam the search engine.
But last February, Google made an update to its search engine ranking algorithm and stated that, with this update, thin or shallow content was now a reason to penalize a web page in its search results.
Keyword Stuffing
This was the original search engine optimization technique, and the search engines caught onto it quickly.
Keyword stuffing is the practice of repeating words or phrases an unnatural number of times on a single page. The idea was, by repeating these words, you could trick the search engine into thinking a page was more relevant to the targeted topic. This practice obviously doesn’t work anymore.
So the next question is, “How many times is too many?” There’s no correct answer except to use common sense. The good news is that you would have to go out of your way to be subjected to this penalty.
Hidden Text
If you have already decided to include a bunch of text on your page to trick the search engines, your next step might be to hide that text from the human reader. For example, you might set the offending text to the same color as the background of your page. This is search engine spamming.
Since Google can see all the code that makes up your web page, it can tell if you are hiding text and penalize you for it. Don’t try to hide text using styles, fonts, or display: none.
Cloaking
So hiding doesn’t work, because the search engine can see the code used to hide the offending text. What can be done about that if we are being evil? How about returning a different page to the search engine spider than the one a human visitor would see? That is called cloaking.
Search engines really hate this. If you are caught doing it, the penalty can be harsh.
Paid Links
The act of buying and selling links is forbidden by Google. This is because links pass on search engine ranking credit, and being able to buy your way to the top would pollute the results, affecting Google’s search result quality and hurting its business. Google penalized J.C. Penney for three months for buying links.
If you choose to ignore Google’s rules, be prepared for little mercy if you’re caught.
Link Spam
If you can’t buy links, maybe you can run around the web and post them yourself. Maybe automate the process with some type of application?
Unfortunately, there is nothing stopping you. It’s just the Internet equivalent of graffiti, and it’s frowned upon by legitimate professionals, SEO services, and the search engines.
The work probably won’t help your site much, either. Most blogs use nofollow links in their comment sections, so those links don’t pass any PageRank juice on. The rest probably have low PageRank to begin with, and thus there is no juice to pass on.
Blocking
Starting in 2011, both Google and Blekko began using blocking as a search engine ranking factor to adjust their search engine ranking results. No doubt Bing won’t be far behind.
Blocking allows anyone who doesn’t like a search result from a site to block that site from ever appearing again in their search results. You won’t be notified that this has happened. Your web site will just never turn up in the results for that user again.
So what? A single user blocking your web site isn’t the end of the world. That may be true, but Google is now using the aggregate blocking data to help rank its results. A site that is blocked by a lot of users will have its rank lowered.
This is pretty easy to avoid. Provide quality content and don’t try to create pages to spam the search engine.
The Big List of Google Penalties for SEO
Every business wants to achieve the highest possible listing on Google. But using questionable techniques to boost your rankings can often end up backfiring.
Violating Google’s Webmaster guidelines will result in penalties that reduce your website’s rankings or completely remove it. To avoid spending time in the Google penalty box, steer clear of the following penalties.
UNNATURAL LINKS TO YOUR SITE (IMPACTS LINKS)
When Google detects unnatural links that have been artificially created by you or someone else on your website, it will penalize the unnatural links by devaluing them. In this case, Google says to clean up the links if you can, but it shouldn’t hurt your website (Google is a little unclear on this one, unfortunately).
Here are a couple of good resources on this one.
- Unnatural links to your site—impacts links – Moz. March 18, 2014
- 3 Unnatural Link Warnings and How to Deal With Them – Search Engine Journal. March 16, 2015
UNNATURAL LINKS TO YOUR SITE
These are links to your site that Google thinks you have control over. As a result, they will penalize your entire website as well as the links. This is a tough one, read about it here.
UNNATURAL LINKS FROM YOUR SITE
This penalty is applied to the web site on the sending end of unnatural links. If you are selling links from your site, this could be you.
HACKED SITE
If Google thinks your site has been hacked, it will be penalized.
THIN OR SHALLOW CONTENT
Increasingly, Google algorithms put a premium on quality of content, which means the content on your site must be relevant and valuable to readers. Thin or shallow content will be considered spam, and treated accordingly.
PURE SPAM
Google hates spam. Websites crammed with filler text, cloaked pages, scraped content or other gibberish will draw a quick penalty.
USER-GENERATED SPAM
Typically applied to websites where users create the spam, such as forums, this earns the same penalty as pure spam.
KEYWORD STUFFING
Excessive use of keywords can get you in trouble. Use too many keywords now and the Google “refs” will throw the yellow flag and penalize your site. You can still use keywords multiple times on a page. Just make sure they make sense within the context of your message.
HIDDEN TEXT
Search engines are like accounting auditors – they want to see everything out in the open, with nothing hidden. This is true even with text that your readers may not want or need to see. You can hide text in many different ways, such as making the font the same color as the background. Regardless of the technique, Google will consider it spam and may penalize your site accordingly.
SPAMMY FREEHOSTS
When Google detects spam on multiple websites hosted by the same host, it usually penalizes all of the websites.
PAID LINKS
Google’s rule on this one is crystal clear – no buying or selling links in a way that passes on search engine ranking credit. Google dislikes this practice so much that it has banned companies like J.C. Penney and Overstock for months at a time. Paid links may seem like a cost-effective way to improve search rankings. But get caught doing it on Google and you’ll pay a heavy price. (Hint: don’t believe programs that tell you their paid links can escape Google detection.) You can report paid links here.
AUTOMATICALLY GENERATED CONTENT
Are you having a program create content on your website?
Are you redirecting the user to a piece of content that is different than what they search for? Learn more.
Did you create a bunch of pages that provide no value, but are intended to rank for keywords? Learn more.
Are you stealing content from other websites and re-purposing it as your own? Learn more.
Are you only listing information about affiliate programs on your website? Learn more.
CREATING PAGES WITH MALICIOUS BEHAVIOR
Are you installing software on someone’s computer, changing home pages, or pushing unwanted files onto a user’s machine? Learn more.
Did you use an automated program to build a bunch of comment links? Learn more.
Google also has seven major algorithmic penalties that can affect your website’s rankings.
The Big List of Google Penalties for SEO
Google prefers quality websites. The Panda algorithm looks for a wide range of on-site quality issues and lowers the rankings of sites where it detects them. You can learn more about that here.
This algorithm penalizes overly aggressive link-building schemes, including any links to and from your site that are intended to manipulate PageRank or a site’s ranking in Google search results. Penguin hits sites especially hard when they have too much exact-match anchor text for a term.
You can thank Hollywood and the recording industry for this one. In response to pressure from these powerhouse industries (and the government), Google now penalizes sites that receive multiple requests to remove copyrighted material. If you post content on your site that might fall within this domain, it’s a good idea to regularly check your Google Webmaster Tools account for any DMCA takedown notifications.
Google has increased its spam detection for highly spammy search terms like payday loans, casinos, or Viagra. Learn more about this here.
Designed to improve local searches, this algorithm won’t directly penalize your website. However, it can potentially lower your rankings if the SERPs that it targets contain locally searched keywords that do not abide by Pigeon’s ranking guidelines. Read our ultimate guide here.
This algorithm focuses on the meaning behind the words in a search query. If Google determines that your content doesn’t match the meaning of the words in a search query, it can reduce your rankings. You can read more about Hummingbird here.
- Google Hummingbird Update Impacts 90% of Searches
- Is your Content Hummingbird Quality?
- Hummingbird Update: Watch a Video by John E Lincoln
- In-depth review of Hummingbird
Google understands that website visitors don’t want to have to slog through a mountain of ads in order to find the information they’re searching for. If Google’s page layout algorithm can’t identify enough useful content “above the fold,” it will lower your ranking. To maintain high rankings, don’t make your readers search for content.
Cleaning Up the Wreckage
Okay, so you got penalized. Now what?
Google understands that violations, especially minor ones, can be unintentional. So they have a process in place whereby you can work with them to resolve the situation.
If your rankings have declined, or your site isn’t appearing in search results at all, check Google’s Manual Actions page. There you will find the nature of your penalty and the appropriate steps to address the problem. Once your site is back in alignment with Google’s Webmaster guidelines, you can request a review of your site directly from the Manual Actions page. If you think it is something else, check out the Panguin Tool. It will give you an overlay of all the Google updates against your analytics account. You can also check out our Google update timeline.
A 301 redirect is a page status code that tells search engines a page has permanently moved to another URL. The new URL might be on the same site or on a different one. You use a 301 redirect when you want to capture the traffic to the old address but the page now physically lives elsewhere. For instance, changing filenames usually changes the URL of the page; if you don’t set up a 301 redirect, visitors who come to your old URL will see a Page Not Found message. There are many ways to perform a redirect, and your options are limited by the technology (e.g. PHP, ASP) you are using. If you want to check whether your URLs are redirecting properly, use the Redirect Check tool.
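The logic behind a 301 redirect can be sketched with a hypothetical old-URL-to-new-URL map. In practice you configure this in your web server (e.g. Apache or nginx) or CMS rather than hand-writing it, but the sketch shows what the server sends back:

```python
# A minimal sketch of 301 redirect handling, assuming a hypothetical
# old-URL -> new-URL map. Real sites configure this in the server.
REDIRECTS = {
    "/old-page.html": "/new-page.html",
    "/products.php": "/products/",
}

def handle_request(path):
    """Return (status, headers) for a requested path."""
    if path in REDIRECTS:
        # 301 tells search engines the move is permanent,
        # so ranking credit is passed to the new URL.
        return 301, {"Location": REDIRECTS[path]}
    return 200, {}

status, headers = handle_request("/old-page.html")
print(status, headers["Location"])
```

The key point is the `Location` header paired with the 301 status: the browser and the crawler both follow it to the new address.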
Unlike a 301 redirect, a 302 redirect is used when you move pages temporarily. For instance, if you are making major changes to your site that involve changing the URLs, and you need weeks to complete this, you can use 302 redirects while you are performing the changes. However, keep in mind that 302 redirects are frequently abused – e.g. to redirect traffic to an illegitimate site – and search engines don’t always regard 302 redirects in a favorable way.
The Alt tag, more correctly called the Alt attribute of the <img> tag, is a way to tell search engines what an image is about. Since search engines don’t index the contents of images, if you don’t provide a meaningful description of the image in the Alt attribute of its tag, search engines won’t index any information about the image except its name. Good SEO practices require that you fill the Alt tag with meaningful description without resorting to keyword stuffing.
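Auditing a page for images that lack Alt text can be scripted with Python’s standard html.parser. The sample HTML below is made up for illustration:

```python
from html.parser import HTMLParser

class AltChecker(HTMLParser):
    """Collect <img> tags that lack a meaningful alt attribute."""
    def __init__(self):
        super().__init__()
        self.missing = []

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            attrs = dict(attrs)
            # Empty or absent alt means the image is invisible
            # to search engines.
            if not attrs.get("alt"):
                self.missing.append(attrs.get("src", "?"))

checker = AltChecker()
checker.feed('<img src="logo.png" alt="Company logo"><img src="decor.png">')
print(checker.missing)  # ['decor.png']
```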
Anchor text is the text of a hyperlink – i.e. in this hyperlink: SEO tools, “SEO tools” is the anchor text of the hyperlink that leads to the home page of the site. Usually the anchor text is underlined but this is not mandatory. The importance of anchor text for SEO is significant because search engines pay a lot of attention to it and if there are keywords in the anchor text, this helps to rank better. Use the Backlink Anchor Text Analyzer to see if the anchor text of the backlinks to your site contains your keywords or not.
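If you want to inspect the anchor text a page uses, a short sketch with Python’s html.parser can extract it. The page snippet here is hypothetical:

```python
from html.parser import HTMLParser

class AnchorTextParser(HTMLParser):
    """Collect the anchor text of every <a> tag on a page."""
    def __init__(self):
        super().__init__()
        self.in_link = False
        self.anchors = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self.in_link = True
            self.anchors.append("")

    def handle_endtag(self, tag):
        if tag == "a":
            self.in_link = False

    def handle_data(self, data):
        # Accumulate text only while inside a link.
        if self.in_link:
            self.anchors[-1] += data

p = AnchorTextParser()
p.feed('<p>Try these <a href="/tools">SEO tools</a> today.</p>')
print(p.anchors)  # ['SEO tools']
```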
Backlinks, also called inbound links, are links from other sites to your site. In SEO, the quality of backlinks (i.e. whether the backlink comes from a reputable site, what the Page Rank of the linking site is, and what the anchor text is) is one of the major ranking factors. For more information, please refer to The Importance of Backlinks article.
A bad neighbor is a site with low quality and/or illegal content. When you link to bad neighbors, this affects your rankings in a negative way. When bad neighbors link to you, this plays no role in your rankings because the assumption is that you can’t control who links to your site. For more details, see the Bad Neighborhood article.
Baidu is the leading Chinese search engine. In many aspects it is different from Google, and if you want to learn how to rank well with it, consult the How to Optimize for Baidu article.
Bing is the Microsoft search engine and as such it is one of the biggest search engines. Its algorithm is different from that of Google, Yahoo, and the other search engines, as explained in the Bing Optimization article.
Black hat refers to a set of unethical SEO practices aimed at getting artificially high rankings with search engines. Black hat includes techniques such as keyword stuffing, cloaking, or the use of doorway pages. Sometimes webmasters inadvertently use black hat techniques – for instance, when they go the over-optimization route – but the end result could still be getting banned from search engines.
A broad match is the opposite of exact match and it occurs when any of the search words (including synonyms) are found in the matched string. For instance, if the search words are “homes in LA”, when a broad match is used, the search results will also include matches for “homes in Los Angeles”, “LA home”, “real estate Los Angeles”, etc.
CTR, or Click-Through Rate, is a measure used in Pay Per Click (PPC) advertising: the percentage of ad impressions that result in a click. If you are using Google AdWords to promote your site, CTR tells you how much attention your keywords are getting – the keywords with the highest CTR get the most attention.
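The calculation itself is straightforward, as this small sketch shows (the click and impression counts are invented for illustration):

```python
def ctr(clicks, impressions):
    """Click-through rate as a percentage of impressions."""
    if impressions == 0:
        return 0.0  # avoid division by zero for unserved ads
    return 100.0 * clicks / impressions

# 40 clicks from 2,000 ad impressions -> 2.0% CTR
print(ctr(40, 2000))  # 2.0
```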
Cloaking is an unethical, black hat SEO technique that involves serving different versions of the same page to crawlers and to human visitors. The version you serve to crawlers is highly optimized for good rankings, while the version you serve to your human visitors is made in a way to appeal to humans.
A crawler, also called a spider, is the search engine robot that crawls and indexes Web pages. It is an automated program that periodically visits a site.
Deep links are backlinks to internal pages. For instance, SEO tips isn’t a deep link because it points to the home page, while Keyword Difficulty is because it points to an internal page, i.e. the http://www.webconfs.com/keyword-difficulty-article-20.php page. You use deep links when you want to promote particular pages from your site, rather than the homepage.
Domain age is a measure of how old a domain is. Generally, all else being equal, older domains get more favorable treatment by search engines than new ones. The reason is that new domains are often created just for backlinks, while older domains that have been around for years are considered more trustworthy. You can check the age of a domain with the Domain Age tool and learn more about the term in the Age of a Domain Name article.
A doorway page, also called a gateway page, is another black hat technique that can get you banned from search engines. The technique employs the creation of numerous identical pages, each of which is optimized for a particular keyword. When the visitor comes through the doorway page, he or she is redirected to another page, aimed at human visitors.
Duplicate content – i.e. content that is verbatim the same as, or very similar to, content found on other sites or other pages of your site – is one of the worst things you can put on your site. Search engines penalize for duplicate content, so use the Similar Page Checker tool to see if you have content that is very similar to other pages online. For more details about what duplicate content is, read the Duplicate Content Filter article.
A dynamic URL is the opposite of a static URL. A dynamic URL is the result of a database query and looks like this: http://www.yoursite.com/products.aspx?Y=2011&M=05. Dynamic URLs are SEO-unfriendly and user-unfriendly, and if you want to rank well with search engines, you should rewrite dynamic URLs (for instance with the URL Rewriting tool) into static URLs, as described in the Dynamic URLs vs Static URLs article.
An exact match is the opposite of a broad match. When an exact search is performed, the search results include only the exact search string. For instance, if the search string is “homes in LA”, the search results will include only those pages where the “homes in LA” string is present and won’t include matches for “homes in Los Angeles”, “LA home”, “real estate Los Angeles”, etc. Exact matches are useful when searching for a competitive keyword because they filter the results rather than delivering millions of broad matches.
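The difference between the two match types can be sketched in a few lines of Python. This is a toy model with a hand-made synonym table, not how a real search engine matches queries:

```python
def exact_match(query, text):
    """True only if the query appears verbatim in the text."""
    return query.lower() in text.lower()

def broad_match(query, text, synonyms):
    """Simplified broad match: any query word (or a listed synonym)
    appears in the text. Real engines use far richer synonym data."""
    words = set(text.lower().split())
    for word in query.lower().split():
        candidates = {word} | synonyms.get(word, set())
        if candidates & words:
            return True
    return False

SYNONYMS = {"la": {"los", "angeles"}}  # toy synonym table (assumption)

print(exact_match("homes in LA", "Find homes in Los Angeles"))          # False
print(broad_match("homes in LA", "Find homes in Los Angeles", SYNONYMS))  # True
```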
Flash is an interactive media technology that makes sites more interesting. At the same time, Flash can kill your search rankings because search engines can’t index Flash content directly. The Optimizing Flash Sites article explains what to do if you have Flash content on your site.
Similarly to Flash, frames are a burden in terms of SEO. When you use frames on a page, you use the same URL but show different content in each of the frames. This confuses search engines because, for them, one URL is one page of content, not multiple pages as is the case with frames, and search engines might not index the content of all the frames on a page.
Another name for doorway page.
Geotargeting is a technique to target an audience based on geographic location. The criterion could be country, state, or city. Geotargeting is also used in local search, when you are interested in traffic from a particular location only, not in the global traffic search engines bring by default.
The leading search engine. It is also a leader in PPC ads (AdSense and AdWords) and in email services.
Googlebot is Google’s spider – i.e. the robot that traverses the Web and indexes pages for inclusion in Google’s database. You can see how your site looks to Googlebot with the Search Engine Spider Simulator tool.
The heading tag is used to separate the sections of text on a page. Heading tags are from <H1> to <H6>, with H1 being the title of the page. Very often H1 and the title tag are the same but for better results you can make them different. Keywords in heading tags weigh more than keywords in the body of the page, so don’t forget to use the right keywords in your tags.
A hyperlink is a link from one page to another or from one place on a page to another place on the same page. Hyperlinks can be inbound or outbound. Hyperlinks that start and end on the same site are called internal hyperlinks.
Inbound links are links from other sites to your site. They are the same as backlinks.
- (Verb) The activity a spider performs when crawling the Web and collecting pages.
- (Noun) The database of indexed Web pages a search engine stores to retrieve search results from.
Internal links are links from one page on a site to another page on the same site, or from one place on a page to another place on the same page. Search engines also assign value to internal links, so it makes sense to have as many meaningful internal links on your site as possible. Internal links are also important for a site’s navigation.
Keywords are a major factor in SEO success. A keyword (or, more commonly, a couple of keywords and/or keyphrases) describes what a page is about. Having keywords in the title and the headings is good for your rankings. The higher the keyword density for a particular keyword, the better you rank for it in search results. However, too high a density could be regarded as keyword stuffing.
The keyword density for a particular keyword defines how many times this keyword is used on a page. Keyword density of 2 to 7 per cent is optimal because everything above this could be regarded as keyword stuffing and everything below this will make your page go down in search results. To check the keyword density of your pages, use the Keyword Density Cloud tool.
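As a rough self-check, keyword density can be computed with a few lines of Python. The sample sentence is made up, and restricting the check to single-word keywords is a simplification:

```python
import re

def keyword_density(text, keyword):
    """Occurrences of a single-word keyword per 100 words of text."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    if not words:
        return 0.0
    hits = words.count(keyword.lower())
    return 100.0 * hits / len(words)

sample = "SEO tools help with SEO tasks. Learn SEO basics first."
# 3 occurrences of "seo" in 10 words -> 30% (far into stuffing territory)
print(round(keyword_density(sample, "seo"), 1))  # 30.0
```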
Keyword research is a major task for every SEO expert. Before you start any SEO, you need to find the right keywords for your site and niche. To do this, you use tools such as the Website Keyword Suggestions tool or the Keyword Playground tool. These tools suggest relevant keywords with high search volume – i.e. the keywords it makes sense to optimize for.
Keyword stuffing is a black hat technique, which involves artificially high keyword density for your target keywords. In the best case keyword stuffing can substantially lower your rankings in search results and in the worst, it can get you banned from search engines.
This is the page where the visitor arrives when he or she clicks a link. The landing page could be the home page, but it could also be any internal page of the site.
Link building – i.e. acquiring quality backlinks from other sites – is a major activity for every SEO. Link building is a very time-consuming process you can speed up with the right tools, such as the Backlink Builder tool. Not all backlinks matter, as you will learn in the Link Building Mistakes article.
Link exchanges are one of the primary approaches to link building. They involve exchanging backlinks with other webmasters, preferably in the same niche.
Link farms are sites with many outbound links, often created with the sole purpose of raising other sites’ rankings. Link farms generally don’t rank well in search results, and the backlinks from them are of low (if any) value.
Link spam is when you get dozens of backlinks in a short time, especially from low quality sites. Very often link spammers use automated tools to get backlinks but even manual submission to numerous sites and blogs is regarded as link spam.
Local search applies when you use geotargeting and other techniques (e.g. your location as a keyword) to attract local visitors and to rank well in local search engines, or in global ones, for searches that include your location. More details about local search can be found in the SEO for Local Businesses article.
Long tail keywords
Long tail keywords are keywords with low search volume but high value. For instance, “dating” is a very general keyword with millions of monthly searches, but it is low value because the traffic you get for it won’t be as targeted as the traffic for a query like “single parents dating Los Angeles”. “Single parents dating Los Angeles” is a long tail keyword.
The meta description is one of the most important meta tags because in it you write your description of the page; search engines pick it up and show it in search results for visitors to see. This is why you need to write meaningful meta descriptions that include keywords without stuffing them.
Similarly to the meta description, meta keywords are also relevant for SEO because this is the place where you list your keywords and search engines read it. Meta keywords used to be more important, but due to constant abuse they now carry less weight; still, they aren’t totally deprecated.
Meta tags are HTML tags, where you describe the contents of your page. Among the most important meta tags are the title, the meta description and the meta keywords.
Mod_rewrite is the Apache module for rewriting dynamic URLs into static. If you don’t want to mess with the configuration of Apache, use the URL Rewriting tool instead.
Navigation is the system of hyperlinks and other auxiliary items that help users find their way through the site. Internal links play a vital role in navigation.
Nofollow is an attribute of links that tells search engines not to follow the link. Links with the nofollow attribute are not counted by search engines (mainly by Google, because some other search engines don’t take the nofollow attribute into account) and therefore don’t count as backlinks.
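To see which of a page’s links carry the nofollow attribute, a short sketch with Python’s html.parser works. The HTML snippet is made up for illustration:

```python
from html.parser import HTMLParser

class NofollowFinder(HTMLParser):
    """Split a page's links into followed and nofollowed ones."""
    def __init__(self):
        super().__init__()
        self.followed, self.nofollowed = [], []

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        attrs = dict(attrs)
        href = attrs.get("href", "")
        # rel may hold several space-separated values, e.g. "nofollow noopener".
        if "nofollow" in attrs.get("rel", ""):
            self.nofollowed.append(href)
        else:
            self.followed.append(href)

f = NofollowFinder()
f.feed('<a href="/a">A</a><a rel="nofollow" href="/b">B</a>')
print(f.followed, f.nofollowed)  # ['/a'] ['/b']
```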
Organic search, also known as natural search, refers to the non–paid search results search engines serve based on the relevancy of the page where the search string is found. Organic search is the opposite of the paid search listings search engines serve for particular keywords.
Outbound links are the opposite of inbound links (more commonly known as backlinks). Outbound links are the links that start from your site and point to another site on the Web. A very high number of outbound links can affect your rankings in a negative way, so keep the number of outbound links reasonable.
PageRank (PR) is a measure used by Google to determine the popularity of a page. PR ranges from 0 (the most unpopular pages) to 10 (the most popular ones). Pages with high PR generally rank better in search results, but this isn’t always so. One important implication of PR concerns backlinks – getting a few backlinks from sites with high PR helps more than getting tons of them from low-PR sites.
A page title is what you put in the title tag – it displays in the browser and tells users what your page is about. Clear and descriptive page titles with keywords in them do a great job for your rankings.
Paid links are links you get in exchange for payment. Buying and selling links is considered a serious offense by Google, and if you get caught, the penalties are severe – from the links not being counted at all, to exclusion from the search index.
Poison words are words that, when used on a page, degrade the quality of the page for search engines. Usually adult words play this role, but there are also other words that can literally poison a page with great content and bury it deep in search results.
Proximity measures how close the words of the search string occur to one another on the page. When the words appear next to each other – i.e. they are an exact match of the search phrase – the page is considered highly relevant.
Reciprocal links occur when you put a link to Site A on your site and Site A puts a link to your site in return. Reciprocal links are a common form of link exchange. You can use the Reciprocal Link Check tool to see how many of your reciprocal links still exist.
A redirect is a way to tell search engines that the URL of a page has changed. When you use redirects, you don’t lose the traffic to your old URL. 301 Redirect (permanent redirect) and 302 Redirect (temporary redirect) are the two most popular ways to redirect a page.
Reinclusion is the act of being included again in Google’s index. Most often, sites are removed from Google’s index because of black hat SEO, and if they want to be reincluded, they need to clean up their act. The Reinclusion in Google article details what to do in order to be admitted into Google again.
robots.txt is a file in which you tell search engines which pages/sections of your site not to index. However, while search engines generally respect your wishes and don’t spider the pages/sections you don’t want them to, don’t assume sensitive information is safe just because you have listed it as not-spiderable in robots.txt. For more information about robots.txt, refer to the What is robots.txt article.
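Python’s standard library can interpret robots.txt rules for you. This sketch parses a hypothetical file from a string instead of fetching it from a live site:

```python
from urllib.robotparser import RobotFileParser

# A hypothetical robots.txt, parsed from a string rather than fetched.
rules = """\
User-agent: *
Disallow: /private/
"""

rp = RobotFileParser()
rp.parse(rules.splitlines())

print(rp.can_fetch("*", "http://example.com/private/data.html"))  # False
print(rp.can_fetch("*", "http://example.com/public/page.html"))   # True
```

Remember that this only tells you what well-behaved crawlers will skip; it is not an access-control mechanism.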
The Google sandbox is a place where new sites are temporarily put before Google includes them in search results. When a site is in the sandbox, it is indexed, but results from it don’t show in the main search results. Instead, results from sandboxed sites are available in the supplementary results, which generally get much less traffic than the standard results pages.
A search engine is a software application that is used to spider, store, index, and display search results from the pages around the Web. Google, Yahoo, and Bing are the three major search engines.
SEO, an acronym for Search Engine Optimization, is a set of tools and skills used to fine-tune pages so that they rank well in search engines. Two of the main SEO activities are keyword research and link building.
SEO tools are various programs and pieces of code that allow you to perform SEO tasks. For instance, there are tools for keyword research and link building, for Backlink Anchor Text Analysis, etc.
SERP is an abbreviation for Search Engine Results Page – i.e. the page search engines display with the results for a particular search query.
A sitemap is a file, usually in HTML or XML format, where links to the important pages on the site are gathered in one place. Sitemaps are useful both for human visitors, because they aid navigation, and for search engines. For more details about sitemaps, please refer to the Importance of Sitemaps article.
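A minimal XML sitemap can be generated with Python’s standard library. The URLs below are placeholders, and a real sitemap would usually also carry `lastmod` and related fields:

```python
import xml.etree.ElementTree as ET

def build_sitemap(urls):
    """Build a minimal XML sitemap for a list of page URLs."""
    ns = "http://www.sitemaps.org/schemas/sitemap/0.9"
    urlset = ET.Element("urlset", xmlns=ns)
    for url in urls:
        entry = ET.SubElement(urlset, "url")
        ET.SubElement(entry, "loc").text = url
    return ET.tostring(urlset, encoding="unicode")

xml = build_sitemap(["http://example.com/", "http://example.com/about"])
print(xml)
```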
Spiders are search engine robots that crawl the Web and index the Web pages they encounter. Only pages that are spiderable are included in the search engine databases and are later shown in search results. This is why it is important that your pages are spiderable. You can check this with the Search Engine Spider Simulator tool. For more details about spiders, please refer to the The Spider View of Your Site article.
A splash page is a home page with very limited content. Typically, a splash page will include only “Click to enter” or a similar text. Splash pages are bad for SEO because they lack content. Splash pages are sometimes made in Flash, which makes them even worse, unless optimized properly.
A static URL is the opposite of a dynamic URL. A static URL is always the same, and if we use the example from the Dynamic URL entry (http://www.yoursite.com/products.aspx?Y=2011&M=05), the static equivalent would be http://www.yoursite.com/products/2011/05. Static URLs are better for your human visitors and for search engines, as explained in the Dynamic URLs vs Static URLs article. This is why it is highly recommended to rewrite dynamic URLs into static ones – you could use the URL Rewriting tool for the purpose.
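The rewrite from a dynamic URL to its static equivalent can be sketched with a regular expression. This toy function handles only the exact `?Y=...&M=...` pattern used in the example, and the domain is a placeholder:

```python
import re

def to_static(url):
    """Rewrite the example dynamic URL pattern into a static path.
    Only handles the ?Y=YYYY&M=MM shape shown above (an assumption)."""
    m = re.match(r"(.+)/products\.aspx\?Y=(\d{4})&M=(\d{2})$", url)
    if not m:
        return url  # leave URLs we don't recognize untouched
    base, year, month = m.groups()
    return f"{base}/products/{year}/{month}"

print(to_static("http://www.yoursite.com/products.aspx?Y=2011&M=05"))
```

On a live site this mapping is normally done by the web server (e.g. Apache’s mod_rewrite), not in application code; the sketch just shows the pattern-to-path idea.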
- In HTML, a tag is markup enclosed in angle brackets (e.g. <title> and </title>) that tells the browser how to interpret a particular piece of content – i.e. as text, image, title, table, etc. Title tags, headings, and meta tags are the most important tags for SEO.
- In social bookmarking this is a form of labeling entries, which usually includes keywords.
The Title tag (<title>Enter your title here</title>) is displayed in the browser titlebar and it contains the title of your page. Keywords here and a descriptive title are vital because this is one of the first things visitors see from your page.
In order to rank well with search engines, you’d better rewrite dynamic URLs into static. You can do it with the URL Rewriting tool.
A user agent is a program used to access the Net. A user agent can be a browser or a spider. Usually log files give more details about the user agents that have visited your site, such as the name of the browser and its version.
Vertical search is a type of specialized search that focuses on results from a particular area only. For instance, when you use a search engine that indexes only business-related pages, you are performing a vertical search. The advantage of vertical search is that the results it delivers are better targeted.
The second largest search engine and also one of the oldest. Optimizing for Yahoo is different from optimizing for Google and Bing and Yahoo’s search results for the same term differ from those of its competitors.