How Will Google Affect SEO?






    Updates to Google's Algorithms and Manual Actions/Penalties 


    Google fine-tunes and changes its algorithms daily, while major algorithm updates are released at regular intervals. 

    Furthermore, they actively check their results to identify sites that are breaching their standards (https://bit.ly/webmaster_best_practices), and such sites may face ranking penalties as a consequence. 

    All of these approaches are intended to aid them in improving the overall quality of their search results. These algorithm tweaks and penalties may sometimes have a significant influence on your organic traffic. 

    A significant drop in search engine traffic to your website may be detrimental to your company's bottom line. 

    This sort of revenue loss may necessitate layoffs or perhaps the closure of the company. As a result, you'll need a basic grasp of how the Google ecosystem works, how Google suggests you run your website, and the many circumstances that might result in lost visibility and traffic. 

    Otherwise, Google updates or penalties may affect you, and it may seem that this occurred due to circumstances beyond your control. 

    However, if you have a good grasp of what Google is attempting to do with its algorithm adjustments and penalties, you may dramatically decrease your exposure to them and perhaps set yourself up to avoid penalties and gain from the improvements. 

    If you've already experienced a traffic loss as a result of an update or penalty, it's critical to know what caused it and what you need to do to recover. 

    Updates to the Google Algorithm. 



    Google's numerous search algorithms are updated in a variety of ways, including changes to search functionality, changes to search result composition and style, changes to various parts of relevance and ranking algorithms, as well as daily testing and bug corrections. In this part, we'll look at the many sorts of adjustments Google makes and how they affect the search results that users interact with.  
     

    BERT.

     
    Google announced the existence of BERT (https://www.blog.google/products/search/search-language-understanding-bert/) on October 25, 2019. Bidirectional Encoder Representations from Transformers (BERT) is a neural network-based natural language processing (NLP) technique. 

    This is what Google had to say about BERT's impact: "BERT will help Search comprehend one out of every ten searches in English in the United States, and we'll expand this to new languages and locations over time." 

    Prior to BERT, when Google's algorithms were attempting to figure out what a word or phrase meant, they could only look at neighboring text that came before or after it. 

    It was essentially unidirectional. BERT allows Google to grasp the meaning of a word or phrase by analyzing the text both before and after it. BERT was initially applied only to US English queries, with Google claiming that it had an effect on 10% of those queries. 

    They stated on December 9, 2019 that BERT had been expanded to cover more than 70 languages (https://searchengineland.com/bert-is-rolling-out-to-google-search-in-over-70-languages-326146). In addition to BERT, Google also released a paper on SMITH, a novel algorithm. This algorithm has the potential to be the next step after BERT. 

    What SMITH may bring to the table is the ability to comprehend lengthier passages inside large texts in the same manner that BERT comprehends words and phrases. It was unclear whether the SMITH algorithm had been rolled out into Google Search as of November 2021, but it shows that Google is still looking into ways to enhance natural language processing.  
     

    Subtopics and Passages.

     
     Google stated on October 15, 2020 that it would be releasing two new search algorithms (https://www.blog.google/products/search/search-on/). 

    The first of these was an algorithm that allowed them to categorize their search results into topics and subtopics. The idea for this came from Google's discovery that, in many situations, broad user queries are quickly followed by other queries aimed at narrowing down what the user is searching for. 

    For example, if a user searches for "Home Exercise Equipment," Google may provide some initial results as well as subsections for "Affordable Exercise Equipment" and "Small Space Exercise Equipment," since these are common follow-up questions. 

    In January 2021, Google's Danny Sullivan revealed that the subtopics algorithm was launched in mid-November 2020. 

    Another of the algorithms disclosed was one that would allow them to recognize and "index" certain portions inside a web page independently of the rest of the page's content. 

    The goal of this modification was to enable them to respond to highly precise user inquiries. The importance of this approach stems from the fact that many user needs are quite specific. While the answers to these questions may be found in a variety of locations on the internet, many of them are buried among other information whose overall relevance may not be well aligned with the individual user inquiry. 

    With this change, Google will be able to distinguish specific passages within a bigger text that are relevant to a particular query. The first version of the Passages algorithm was rolled out on February 11, 2021, according to Google's Danny Sullivan (as @SearchLiaison on Twitter). 
     

    Core Web Vitals and Page Experience.

     
     Google stated on May 28, 2020 that it would start utilizing a new signal called Page Experience (https://developers.google.com/search/blog/2020/05/evaluating-page-experience). 

    This was followed by a series of explanations about the new signal's rollout schedule. The Page Experience deployment started in mid-June 2021 and was projected to be completed by the end of August 2021 (https://developers.google.com/search/blog/2021/04/more-details-page-experience). 

    The Page Experience signal is made up of a number of pre-existing signals that all have something to do with whether or not your site provides a positive user experience. 

    Google has combined all of these signals into one bigger score in the overall Google algorithm because doing so makes the idea of Page Experience as a ranking component much easier to handle. 

    Page Experience's total weight may be viewed as a single signal, and the relative weighting of the distinct components can be determined independently of the main algorithm. Furthermore, if Google decides to introduce a new page experience related signal, it may be readily added to the Page Experience signal without affecting the bigger algorithm. 

    While Page Experience is vital, keep in mind that the most significant signals are always content relevancy and quality. For example, simply because your website about tadpoles is quick, it won't start ranking for user search queries concerning cooking pots. 

    Similarly, even if your content is very relevant, a high Page Experience score would not help you rank it. However, there are also occasions when searches are very competitive, with several viable sites offering high relevance and high quality to satisfy the user's needs. In these situations, the Page Experience signal might help you rank somewhat higher than your competitors. 
     

    Update on Link Spam.

     
     Google announced yet another important change in 2021, this time focusing on link spam. The Link Spam Update, as it was dubbed, started rolling out on July 26, 2021 and ended on August 24, 2021. In a blog post titled "A reminder on qualifying links and our link spam update" (https://developers.google.com/search/blog/2021/07/link-tagging-and-link-spam-update), Google detailed the nature of this modification. 

    While the article does not expressly state what the link spam update addressed, it does begin with a discussion of affiliate links and guest blogging issues. 

    This includes a reminder on the importance of using link attributes such as rel="nofollow", rel="sponsored", and rel="ugc" where applicable. This isn't to say that other forms of link spam weren't considered, but it does indicate that these were the primary emphasis. 
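    As a reference for what these qualifiers look like in markup, here is a minimal sketch in Python that builds anchor tags with the appropriate rel attribute. The link_type labels ("paid", "user", "untrusted") are made up for this illustration; only the rel values themselves (nofollow, sponsored, ugc) come from Google's guidance.

```python
def qualify_link(url: str, anchor_text: str, link_type: str = "organic") -> str:
    """Build an anchor tag carrying the rel value Google suggests for each link type.

    The link_type labels are hypothetical categories for this sketch:
      "paid"      -> rel="sponsored" (ads, affiliate, and paid placements)
      "user"      -> rel="ugc"       (user-generated content, e.g., comments)
      "untrusted" -> rel="nofollow"  (links you don't want to vouch for)
    """
    rel_values = {"paid": "sponsored", "user": "ugc", "untrusted": "nofollow"}
    rel = rel_values.get(link_type)
    if rel:
        return f'<a href="{url}" rel="{rel}">{anchor_text}</a>'
    return f'<a href="{url}">{anchor_text}</a>'

# An affiliate link would be tagged as sponsored:
print(qualify_link("https://example.com/product", "Buy here", "paid"))
```

    A natural editorial link needs no rel qualifier at all; the attributes exist so Google can discount links that were paid for or not editorially vouched for.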

    What it does show is that Google is still grappling with link spam, and although they have made significant progress over the years, there is still space for improvement. 

    Updates to the Broad Core Algorithm. 


    Google started announcing what it terms Broad Core Algorithm Updates (BCAU) in March of 2018. Since then, Google has been unveiling significant changes on a regular basis, and as Google's Danny Sullivan has pointed out, they occur multiple times every year. 

    It's also worth noting that these Google verified changes are ones that they think important enough to confirm, but Google releases many more updates about which they choose not to comment. 

    Aside from these verified improvements, the industry has highlighted a slew of additional dates when Google's algorithm tweaks seem to have had a greater effect. 

    These unverified changes may have a big influence, with a lot of websites gaining or losing traffic as a result. 

    Furthermore, Google updates its algorithms on a regular basis. Google's Danny Sullivan said in July 2019 that the company had made over 3,200 algorithm changes in the previous year (https://www.blog.google/products/search/how-we-keep-google-search-relevant-and-useful/). 

    Changes in Functionality

     
    Google makes changes to its search engine functionality on a regular basis, and some of these changes are announced. 

    Bug Fixes on Google

     
    Because Google Search is such a huge and complicated ecosystem, it's unavoidable that problems may appear from time to time. 
     
     

    Webmaster Guidelines from Google


    If you're the owner/publisher of a website and want to increase your Google traffic, it's important to learn Google's Webmaster Guidelines (https://bit.ly/webmaster_best_practices). These are the guidelines that Google expects webmasters to follow while creating and maintaining their websites. While Google cannot compel you to follow these principles, it may choose to penalize websites that do not. 


    Fundamental guidelines that Google expects webmasters to follow:  

     

    Construct pages with users in mind, not search engines. 

     
     This is a crucial component of any company's online presence. Knowing what your target consumers want, how they search, and how to provide that information in a comprehensible and entertaining manner is smart business, and it's also beneficial for Google rankings. 
     

     Don't mislead your customers. 

     
    Unfortunately, this one made the list because many websites utilize bait-and-switch methods to lure visitors into material and experiences that aren't what they anticipated. 

    For example, sites with Cumulative Layout Shift issues (covered in the Page Experience portion of this article) may lead visitors to click on the incorrect area of a page, resulting in a bad user experience. 

    Avoid using techniques to boost your search engine results. 

    A decent rule of thumb is to consider if you'd feel comfortable defending your actions to a competitor's website or a Google employee. 

    "Does this aid my users?" is another good question to ask. 

    "Would I do this if there were no search engines?" Take note of the principle's last phrase. 

    It may seem naive at first, but once you understand that Google actively tweaks its algorithms to locate the sites that serve people best, it all makes sense. 

    All of Google's algorithms are being fine-tuned to discover the greatest user experiences, thus concentrating your efforts on providing exceptional value for people is tightly linked to increasing your chances of ranking in Google. 

    Consider what makes your website special, useful, or interesting. 

    Make your website stand out from the competition in your industry. 

    A user-centered approach is vital, but not sufficient. You should also attempt to establish a site that stands out, just as you would if you were running a company. 

    Otherwise, there will be nothing on your website that would entice consumers to visit it, and Google will have no motivation to rank it highly in the search results. 

    Google outlines a number of particular rules in addition to these fundamental concepts. These are grouped into two categories: practices to avoid and behaviors to follow. 
     


    Practices to Stay Away From.

     
     
     

    Automatically produced content.

     
     In this case, Google is focusing on pages that are created artificially for the goal of getting search traffic but contribute no actual value. Of course, if you operate a retail site, you may be utilizing your ecommerce platform to produce pages that reflect your product database automatically, but Google isn't concerned with that. 

    This is more aimed towards machine-generated (a.k.a. "mad-libbed") content that makes little sense to readers. 

    Participating in link schemes. 

    Because connections to your site remain a key component of the Google algorithm, several parties are providing techniques to produce links to your site inexpensively and artificially. Concentrate your efforts instead on attracting links that indicate authentic citations of your site.  
     

    Creating pages with little or no original material.

     
     This may take various forms, including automatically created pages, pages with little or no user value or purpose that exist only to persuade someone to click on an affiliate link, content stolen from other sites, and doorway pages.  

    Cloaking 

     
    Cloaking is "the practice of presenting different content or URLs to human users and search engines," according to Google. The reason this is a problem is that some websites were built to provide Google with a rich informative experience that Google might choose to rank, but when users arrived at the site, they got something completely different. 
     

    Sneaky redirects. 

     
     This is when you utilize redirects to route people to a different page than what Googlebot sees. Users may be sent to information that does not match what they anticipated when they click on a link in a Google search result, much as they were with cloaking.  
     

    Text or links that are hidden.

     
     These are spammy practices that date back to the early days of search engines, in which material is presented on a website in such a manner that it is not visible, such as putting white text on a white background or positioning it far off the page using CSS. A frequent spam strategy with links was to include a link to a page but only use one character as a link, such as a hyphen. 
     

    Doorway pages.

     
     These are pages that were designed exclusively for the aim of attracting search engine traffic, rather than to provide a fantastic user experience. In actuality, they are often produced in large quantities and are poorly integrated with the rest of the website. They might also be programmed to target a large number of search keywords that are quite close but not identical. 
     

    Content that has been scraped. 

     
     Taking information from other websites and republishing it on your own is not just a copyright infringement, but it's also frowned upon by Google. Minor changes, such as the use of synonyms, are also insufficient. If you're going to quote material from another website, be sure to give credit to the original source and add your own unique value.  
     

    Taking part in affiliate programs without offering enough value.

     
     In the past, Google had a lot of issues with sites that made all of their money from affiliate programs finding ways to rank low-quality material in the search results. There's nothing wrong with earning a portion of your income from affiliate programs, or even all of it. But if you don't have much valuable material to offer users, Google will not rank your site. 
     

    Loading pages with keywords that aren't relevant.  

     
     Also known as "keyword stuffing," cramming your website with irrelevant or unnecessarily repeated phrases detracts from the user experience and is considered spammy by Google. 

    Creating dangerous websites, such as those that engage in phishing or install viruses, trojans, or other malicious software.

     
     The reasons for Google's refusal to include such sites in search results are evident, although they are not necessarily the consequence of the publisher's actions. Sites may be hacked, so it's important to be alert about maintaining the security of your site and checking frequently to see whether it has been hacked. 
     

    Structured data markup is being abused.  

     
     Structured data allows you to improve the look of your listing in Google's search results, but it also has the potential to be abused.  
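    To make the legitimate use concrete, here is a minimal, illustrative schema.org Product snippet built with Python's standard json module. The field values are invented; the point is that markup like this must describe content actually visible on the page, since marking up invisible or misleading data is exactly the kind of abuse Google targets.

```python
import json

# Illustrative values only; real markup must mirror visible page content.
product = {
    "@context": "https://schema.org",
    "@type": "Product",
    "name": "Folding Exercise Bike",
    "aggregateRating": {
        "@type": "AggregateRating",
        "ratingValue": "4.4",
        "reviewCount": "89",
    },
}

# Structured data is typically embedded as a JSON-LD script tag in the page head.
json_ld = f'<script type="application/ld+json">{json.dumps(product)}</script>'
print(json_ld)
```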
     

    Sending automated inquiries to Google.

     
     This is the practice of sending massive numbers of searches to Google using automated tools. This sort of activity is often used for rank monitoring, but Google doesn't like it since it wastes their resources with little value to them. Many tools, such as BrightEdge, Searchmetrics, SEMrush, seoClarity, Conductor, and others, provide large-scale rank tracking. Using one or more of these tools may be a very useful part of your SEO strategy, as long as you don't overdo it. 

    Follow These Good Hygiene Practices

     
    This list is rather brief and concentrates on two topics that reflect optimum site hygiene procedures. 
     

    Watching for hacking on your site and deleting hacked material as soon as it appears.

     
     Unfortunately, this is more prevalent than you may think. Hackers use programs that scour the internet for security flaws, then use those flaws to inject their code into your web pages, frequently in the form of invisible links to their own sites. 

    One of the most effective measures you can take to reduce your risk is to keep your software platform up to date at all times. If you use WordPress, for example, always install the most recent updates as soon as they become available. This also includes any plugins. 
     

    Preventing and eliminating spam from your site created by users.

     
     Any site that enables users to submit material in any way, for example by allowing comments on your content or hosting forums, runs the risk of receiving spammy content. 

    Some unscrupulous actors may manually insert spammy content, while others use programs that crawl the internet looking for places to leave comments or posts on websites. Some of the best practices here include requiring moderation of all comments or posts, or reviewing all comments or posts as soon as they are published. 

    There are gradations to this, such as requiring moderation of any user's initial comment or post, but allowing them to submit further material without approval after that. You should, however, make time to review such contributions after they've been submitted. User-generated spam may also be found on free hosting services, which allow anybody to set up websites without spending any money. 

    If you run a free hosting platform, you'll need to follow similar steps to guarantee that you don't end up with spammy material on your site. Take a look at Google's Webmaster Guidelines (https://developers.google.com/search/docs/advanced/guidelines/webmaster-guidelines). Anyone who starts to invest proactively in improving their organic search presence should be aware of these guidelines and take action to ensure that their company does not breach them.  
     

    High-Quality Content 


    Because we, as website content creators, desire Google traffic, it is our responsibility to create high-quality material. This necessitates an understanding of our target audience, how and what they search for, and then providing high-quality content wrapped in an excellent user experience so they can quickly find what they're looking for. 

    However, as you might expect, developing high-quality content isn't always straightforward, and many people try to cut corners, which may lead to low-quality or even spam-like material appearing in search results. To address this, Google does a variety of things to ensure that low-quality material does not appear in the SERPs. 

    On February 24, 2011, a decade ago, Google made a huge stride forward when it unveiled the Panda algorithm. Google said the following in its release statement (http://bit.ly/more_high-quality): 

    Many of the adjustments we make are so little that they go unnoticed by most others. 

    But, in the last day or so, we've implemented a significant algorithmic change to our ranking—a change that affects 11.8 percent of our queries—and we wanted to let people know about it. 

    This update aims to lower the ranks of low-quality sites, such as those that provide little value to visitors, replicate material from other websites, or are just not very helpful. Simultaneously, it will boost the ranks of high-quality sites with unique material and information, such as research, in-depth reporting, and intelligent analysis. 

    The most significant change Panda made to the Google landscape was that it improved Google's ability to evaluate content quality. Downgrading sites that published low-quality material in big numbers in order to garner significant amounts of search traffic was one part of this. Panda, on the other hand, evolved over time to address challenges of material quality on a far greater scale. 

    Panda was formerly a distinct algorithm from the main Google algorithm, but Google declared in January 2016 that Panda had been completely merged into the main algorithm. Google's algorithms continue to prioritize content quality. 
     



    Content that Google despises. 



    The following are some of the main sorts of material that Google believes to be poor:
     

    Thin content.

     
    This is described as pages with very little content, as one would imagine. User profile pages on discussion sites with minimal information filled in, or an ecommerce site with millions of goods but little information for each one, are two examples.

     

    Unoriginal material. 

     
    These might be scraped pages or pages that have only been slightly modified, and Google can readily detect them. Google's algorithms may penalize websites with even a modest number of these sorts of pages. 

    Nondifferentiated material.

     
    Even if you write 100% unique articles, this may not be sufficient. If every page on your site covers topics that have been covered hundreds or thousands of times previously, your site isn't truly adding anything new to the web. 



    Poor Quality Content.



    Material that is erroneous or badly constructed is referred to as poor-quality content.
     
     This may be difficult to notice in many circumstances, but material with bad language or many spelling errors is one clue. Google may also use fact-checking as a means of identifying low-quality material. 

     

    Curated content

     
    Google's algorithms penalize sites with a significant number of pages consisting of lists of curated links. Although content curation isn't intrinsically negative, it's critical to include a large amount of meaningful commentary and analysis if you're going to do it. Pages that are mostly links will score poorly, as will pages with many links and only a tiny quantity of original material. 
     

    Thin slicing

     
     This was formerly a prominent strategy employed by content farms. Let's say you wanted to write about colleges that provide nursing degrees. Content farm sites would publish many articles on basically the same subject, with names like "nursing schools," "nursing school," "nursing colleges," "nursing universities," "nursing education," and so on. There is no need for all of those various articles, since they would be nearly identical in content.  
     

    Content produced by a database.

     
    Using a database to produce web pages isn't intrinsically wrong, but many businesses were doing it on a massive scale. 

    This might result in a lot of thin-content or low-quality pages, which Google dislikes. It's worth noting that ecommerce systems effectively generate content from a database, which is OK as long as you work hard to create compelling product descriptions and other information for those sites. 

    Diverse Content Is Important.

     
     For Google, diversity is critical to overall search quality. The search query "jaguar" is an easy way to demonstrate this: the term may refer to anything from an animal to a vehicle to a guitar to an operating system to an NFL franchise. 


    The Role of Authority in Content Ranking.

     
     While Google provides a lot of results on a subject, there are a few sites that score well for this search query. What factors go towards determining their rank? 

    When producing information on a subject that is already widely covered on the Internet, really high-authority sites are likely to fare OK. There are a few plausible explanations for this: A lot depends on your reputation and authority. 

    Even if the New York Times Lifestyle section published a new story about how to make French toast, readers could react warmly to it. 

    Because of the site's repute, user engagement signals with the search result for such material would most likely be fairly high. High-authority sites are presumably that way because they don't participate in a lot of the conduct that Google warns webmasters about. 

    You're unlikely to come across a lot of thin material, "me too" stuff, thin slicing, or any of the other flaws that Google's algorithms target. A high-authority site may simply be subjected to a broader set of standards than other sites. It's unclear what characteristics give higher-authority sites greater wiggle room. 

    Is Google evaluating the user's engagement with the material, the content's quality, the publisher's authority, or a combination of these factors? 

    What Google does most likely has parts of all three. 

    Weak Content's Impact on Rankings 


    Even poor content in a single section of a bigger site might lead Google to reduce the site's overall rankings. 

    This is true even if the material in question accounts for less than 20% of the site's pages. As shown in Figure 2-13, this may not be an issue if the remainder of your site's content is excellent, but it's better not to take the risk if you have known weak pages that are worth the time to fix.  
     

    Improving Content That Isn't Good.

     
     When dealing with thin content, it's essential to delve deep and ask tough questions about how to create a site with a lot of great material and plenty of user interaction and engagement. You want to generate highly distinctive content that people seek, like, share, and connect to on your site. Creating content that people will interact with is a science. We all know how crucial it is to choose interesting headlines for our material, and we also know how vital it is to include captivating visuals. Make it a point to learn how to generate compelling content that people will want to read, and then apply those concepts to each page you make. Furthermore, track your interaction, experiment with alternative ways, and enhance your ability to create amazing content over time.  
     

    Actions to do if your pages aren't performing well.

     
    Addressing your site's weak pages should be a large part of your emphasis when you review it. They might take the shape of an entire section of low-quality material or a couple of pages strewn throughout your site's higher-quality content. 

    Once you've found those pages, you have a few options for dealing with the issues you've discovered: 

    Improve the material. This might include rewriting the content on the page to make it more appealing to visitors. 

    Add the noindex meta tag to the page. This instructs Google not to index the page, thereby removing it from the Panda equation. 

    Delete the pages and 301-redirect users to other pages on your site. Only use this option if there are quality pages that are relevant to the ones that have been removed. 

    Return a 410 HTTP status code when someone attempts to browse a page that has been removed. This informs the search engine that the page has been deleted. 

    Use the URL removal tool (http://bit.ly/remove_content) to remove a page from Google's index. This should be approached with caution: you don't want to remove other high-quality pages from Google's index by mistake!  
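    The cleanup options above can be sketched as simple routing logic. Everything in this snippet is hypothetical (the paths, the redirect targets, the page sets); a real site would drive these decisions from its CMS, but the status codes and the noindex tag are the standard mechanisms involved.

```python
# Hypothetical page sets; a real site would load these from its CMS or config.
REDIRECTS = {"/old-guide": "/new-guide"}   # deleted pages with a close replacement
GONE = {"/spammy-page", "/thin-page"}      # deleted pages with no replacement
NOINDEXED = {"/internal-search-results"}   # kept, but excluded from the index

def handle_request(path: str):
    """Return (status_code, header_or_tag) for a requested path."""
    if path in REDIRECTS:
        # 301 sends users and link equity to the replacement page.
        return 301, f"Location: {REDIRECTS[path]}"
    if path in GONE:
        # 410 tells crawlers the page is gone for good.
        return 410, None
    if path in NOINDEXED:
        # The page still serves, but asks search engines not to index it.
        return 200, '<meta name="robots" content="noindex">'
    return 200, None

print(handle_request("/old-guide"))  # (301, 'Location: /new-guide')
```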
     

    High-Quality Links 



    We need only look at Larry Page and Sergey Brin's original paper, "The Anatomy of a Large-Scale Hypertextual Web Search Engine" (http://infolab.stanford.edu/backrub/google.html), to see how Google employs links. 

    This paragraph appears at the start of the paper: The web's citation (link) graph is a valuable resource that is frequently ignored by present online search engines. 

    We've made maps with up to 518 million of these linkages, which is a large portion of the total. These maps make it possible to quickly calculate a web page's "PageRank," an objective measure of its citation importance that closely matches people's subjective perceptions of importance. 

    PageRank is a great approach to rank the results of online keyword searches because of this correlation. 
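    The PageRank idea described here can be illustrated with a toy power-iteration sketch. The three-page link graph and the damping factor of 0.85 are purely illustrative; this is the textbook formulation of the algorithm, not Google's production system.

```python
def pagerank(links, damping=0.85, iterations=50):
    """Compute PageRank for a graph given as {page: [pages it links to]}."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}          # start with equal rank everywhere
    for _ in range(iterations):
        new_rank = {p: (1 - damping) / n for p in pages}
        for page, outlinks in links.items():
            if not outlinks:                    # dangling page: spread rank evenly
                for p in pages:
                    new_rank[p] += damping * rank[page] / n
            else:                               # each link passes an equal share
                for target in outlinks:
                    new_rank[target] += damping * rank[page] / len(outlinks)
        rank = new_rank
    return rank

# Pages "a" and "b" both cite "c", so "c" earns the highest rank.
ranks = pagerank({"a": ["c"], "b": ["c"], "c": ["a"]})
print(max(ranks, key=ranks.get))  # c
```

    The citation intuition carries over directly: each link acts as a weighted vote, and votes from highly ranked pages count for more.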

    The notion of a citation is quite important. 


    The citation list is used by a paper's author to recognize important sources consulted while writing the paper. 

    If you looked at all of the articles on a specific subject, you could pretty quickly figure out which ones were the most significant since they had the most citations (votes) from other publications. 

    Consider what links represent to understand why they are valuable as a signal for search engines. When someone links to your website, they are inviting visitors to leave their own site and visit yours. In general, most website publishers want to attract as many visitors as possible to their site. 

    Then they want those visitors to do something useful on their site, like purchase something, watch advertising, visit a lot of pages to see a lot of ads, or click on ads. On some sites, where expressing a strong opinion on a contentious issue is the purpose, the goal may simply be to convince the visitor to read your complete perspective. 

    The direct economic value of a user clicking on a link to a third-party website that is not an ad might be difficult to perceive in any of these circumstances. 

    Finally, individuals employ links when they feel they are pointing a user to a high-quality online resource that will provide value to that person. 

    This adds value to the site that implemented the link since the user will have had a positive experience on their site because they linked the user to a helpful resource, and the user may return for future visits. This information is used by Google to help it evaluate which resources on the web are of the highest quality. 

    For example, if someone types in "create a rug," Google would likely return tens of thousands of sites that explore the subject.

    What criteria does Google use to determine which is the best, second best, and so on? 

    Even the most advanced AI systems are unable to reach this conclusion based on content analysis alone. Links allow Google to see what other people on the internet consider to be valuable resources, and they serve as an input to its algorithms for judging content quality. Not all links, however, are valuable. 

    Ads, of course, are skewed because they are paid for. Low-value links, as well as those lacking any information or experience about the subject, are likely to be discounted. Furthermore, many sites continue to try to manipulate the link algorithm in order to get high rankings without really deserving them. 

    It's important to know what forms of conduct are unnatural, and hence likely to be ignored or punished by Google, in addition to knowing why certain sites might naturally add links to a third-party site. In the academic environment, for example, you cannot purchase the placement of a citation in someone else's research paper. 

    You don't barter for such placements ("I'll mention you in my paper if you mention me in yours"), and you surely wouldn't sneak references to your work into someone else's research paper without the writer's permission. 

    You wouldn't publish dozens or hundreds of badly written articles solely to get more references to your work in them, either. 

    You wouldn't upload your work to dozens or hundreds of sites set up as repositories for such papers if you knew no one would ever read it or if the repositories included a large number of fraudulent papers with which you didn't want to be linked. 

    In short, you cannot vote for yourself. All of these practices, of course, have their equivalents on the Internet involving links. All of these techniques are in direct opposition to how search engines seek to utilize links, since search engines rely on links that have been earned on merit. 

    This implies that search engines do not want you to buy links in order to manipulate their results. Of course, you may purchase ads; there's nothing wrong with that, but search engines prefer that ad links carry the nofollow attribute, which tells them not to count those links.
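    For reference, this preference is expressed with a rel attribute on the anchor tag. The URLs below are hypothetical, and note that since 2019 Google has also supported rel="sponsored" specifically for paid links:

```html
<!-- A paid link marked so search engines won't count it as an endorsement
     (the destination URL is hypothetical): -->
<a href="https://example.com/offer" rel="nofollow">Partner offer</a>

<!-- The more specific attribute Google introduced in 2019 for paid links: -->
<a href="https://example.com/offer" rel="sponsored">Partner offer</a>
```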

    Furthermore, pure barter links are either discounted or disregarded entirely. 

    From 2000 to 2005, it was common to send individuals emails offering to link to them in exchange for them linking to you, on the theory that this would help with search engine rankings. Of course, these kinds of reciprocal links aren't true citations. 

    Links from user-generated content sites, such as social networking sites, will also be ignored by Google. 

    Anywhere individuals can link to themselves is a place search engines will ignore, or even penalize, if they discover harmful behavior patterns. 

    Google spent a lot of time and money creating systems for identifying low-quality links. For many years, it was a labor-intensive procedure. 

    However, with the first release of the Penguin algorithm on April 24, 2012, they made a major stride ahead. 

    Penguin marked the beginning of Google's practice of automatically recognizing low-quality links and either discounting them or imposing an algorithmic penalty on the sites that received them. 

    Until the release of Penguin 4.0 on September 23, 2016, Penguin ran independently from the main algorithm and was only updated on a periodic basis. As of that date, Penguin was fully integrated into the main algorithm. 

    Google's algorithm was also altered on that day to concentrate entirely on finding low-quality links and downgrading them to zero value. 

    Google's confidence in the Penguin approach had risen to the point where penalizing these links was no longer necessary. Google's web spam team, however, continues to manually evaluate link profiles for sites suspected of having a suspicious link profile, and may levy penalties against them.

    We'll go through this in further detail in the Penalties portion of this article. In the meantime, it's a good idea to know what kinds of links Google doesn't like. 
     

    Links That Google Dislikes

     
    The following is a list of several sorts of links that Google may deem less useful, if not entirely worthless:  
     

    Article directories 

     
     These are websites where you may submit an article for publication with little or no editorial scrutiny. All you had to do was post an article, and it could include links back to your site. The difficulty is that this is a type of self-voting, and links from these sites are rather simple for Google to detect.
    Many directories on the Internet exist only to collect money from as many sites as possible. In these sorts of directories, which have little or no editorial scrutiny, the owner's primary aim is to collect as many listing fees as possible.  
     

    Links from nations where you don't conduct business 

     
     If your firm exclusively does business in Brazil, there's no reason to have a lot of links from Poland or Russia. There isn't much you can do if someone chooses to give you links that you didn't ask for, but there's no reason to participate in activities that would lead to you receiving links from such countries. Similarly, some SEO experts go out of their way to gain links from all over the place, including links from sites in languages your customers don't read; these offer little value.  
     

    Comment spam. 

     
     Dropping links in comments on forums and blog articles was once a common method. Since Google added the nofollow attribute, this strategy has become much less lucrative, yet active spammers continue to pursue it. In practice, they deploy bots to leave automatic comments on blog articles and forums all over the Internet. They may leave 1 million or more comments in this manner, and even if only 0.1 percent of those links are not nofollowed, the spammers will still get 1,000 links.  
     

    Guest post spam

     
     refers to badly written guest articles that provide little value to visitors and were created just to get a link back to your own website.  
     

    Guest posts that have nothing to do with your site. 

     
     This is a sort of guest post spam in which the content created has nothing to do with your website. If you sell old automobiles, don't expect Google to think a guest article on lacrosse equipment with a link back to your site is valuable. 
     

    In-context guest post links 

     
     Posts that contain links back to you in the body of the article are another kind of guest blogging that Google dislikes, especially if the links are keyword-rich and don't add much value to the post itself.  
     

    Advertorials 


    This is a kind of guest post that is written in the style of an advertisement. Given the format, it's quite probable that the website that published it was compensated in some way. If you're going to use guest blogging as part of your approach, focus on sites that don't allow these sorts of posts. While the above four instances all involve guest posts, Google generally frowns on any form of guest blogging done purely for the purpose of link building. This isn't to say you shouldn't guest post; rather, your objective should be to encourage people to read your material, not to acquire links.  
     

    Widgets.

     
     Creating helpful or fascinating tools (widgets) and enabling third-party websites to distribute them on their own sites became a popular strategy. Typically, the widgets included a link back to the creator's website. In theory, there is nothing wrong with this approach if the widget is highly relevant; nonetheless, the strategy was overused by SEOs, causing Google to disregard many of these sorts of links.  
     

    Infographics.

     
     This is another area that, although theoretically permissible, has been heavily exploited by SEOs. At this time, it's unclear what Google does with these links, so you should only produce infographics if they're really relevant, helpful, and (of course) correct.  
     

    Anchor text that is misleading.

     
     This is a more nuanced problem. Consider the case where a link's anchor text says "information about golf courses," yet the page to which the link points is about tennis rackets. This is not a pleasant user experience, and it is not something that search engines will like. 
     

    Malware-infected sites 

     
     Obviously, Google seeks to disregard these sorts of links. Malware-infected websites are very detrimental to users, therefore any link from them is worthless and even dangerous. 
     

    Footer links. 

     
     While there is nothing inherently wrong with a link in the footer of someone's website, Google may discount such links since they are less likely to be clicked on or seen by people. Read Bill Slawski's essay "Google's Reasonable Surfer: How the Value of a Link May Differ Based on Link and Document Features and User Data" (http://bit.ly/reasonable surfer) for more information on this issue. 
     

    Unrelated links in a list.

     
     This might be an indication of a bought link. Assume you come across a link to your "Travel Australia" website in a list of links that also includes an online casino, a mortgage lead generation site, and a lottery ticket site. This does not look good to Google. 
     

    Links from low-quality sites.

     
     The most valuable links are those that originate from extremely high-quality sites that demonstrate a high level of editorial control. Conversely, as quality declines, so does editorial control, and Google may stop counting these links altogether. 
     

    News releases.

     
     It was once fashionable to send out a large number of press releases, each containing keyword-rich text links back to your website. Of course, this is a type of self-voting, and press releases should not be used to promote your site in this manner.  
     

    Social bookmarking sites. 

     
    Delicious, Evernote, and Diigo are just a few examples of great services for storing interesting links for your personal use. However, since they are user-generated content sites, their links are nofollowed and have no effect on your site's ranking. Not all of the types of links listed above will necessarily result in a penalty for your site, but they are all examples of links that Google will most likely ignore. 

     

    Removing Low-Quality Backlinks 


    The first step in the link cleansing procedure is to get into the correct frame of mind. Consider how Google views your links when you analyze your backlink profile. 

    Here are some general guidelines for determining if a link is valuable:

    If Google and Bing didn't exist, would you want that link? 

    Would you happily display it to a potential client before she makes a purchase? Was the URL provided as a legitimate recommendation? 

    You may find yourself attempting to justify the use of a link as you analyze your backlinks. 

    This is typically a solid indication that the link isn't a good one.

    High-quality linkages don't need to be justified; their value is self-evident. 

    Recognizing the need to be thorough is another important component of this approach. It's terrifying to lose a lot of traffic, and it's normal to feel impatient. If your site has been hit with a manual link penalty, you'll be eager to submit your reconsideration request, but once you do, there's nothing you can do except wait. 

    If you don't do enough to eliminate harmful links, Google will reject your request for reconsideration, and you'll have to start again. If you submit a number of reconsideration requests without result, Google may give you a notice advising you to take a break. 

    Make a point of removing and disavowing links as aggressively as possible, and don't attempt to save a lot of marginal ones. In the end, this nearly always speeds up the process. 

    Furthermore, those dubious links that you attempt to save usually aren't really benefiting you anyway. With all of this in mind, you'll want to complete the procedure as swiftly as possible. 
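    For links you can't get removed, Google's Disavow Links tool (discussed later in this article) accepts a plain UTF-8 text file with one entry per line; a `domain:` prefix covers an entire domain, and lines starting with `#` are comments. A minimal sketch, with hypothetical domains:

```text
# Could not reach these site owners; disavowing instead.
domain:spammy-article-directory.example
domain:low-quality-links.example

# Disavow links from a single page only:
http://questionable-blog.example/paid-links-page.html
```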
     

    Data Sources for Link Cleaning. 

     
     In your site's Search Console account, Google displays a list of external links. 

    Because this list tends to be incomplete, we suggest that you gather links from a variety of additional sources. Ahrefs (https://ahrefs.com/), Majestic SEO (https://www.majestic.com), SEMrush (https://www.semrush.com), Link Explorer (https://moz.com/link-explorer), and LinkResearchTools (https://www.linkresearchtools.com) are some of the best additional sources. 

    Each of these tools, like Search Console, provides only a partial list of links. Because these vendors are small relative to Google and the task of crawling the web as completely as Google does is difficult, it should come as no surprise that they do not cover the whole web.

    Building a database from the combined data from all of these tools, on the other hand, will provide a more comprehensive list of links. 

    In a study of link tool vendors, Perficient discovered that combining these data sources surfaced twice as many links as the vendor with the biggest index of links (https://blogs.perficient.com/2021/01/26/study-who-has-the-largest-index-of-links/). 

    Of course, there will be a lot of overlap in what they display, so make sure the list is deduplicated. Even combining all of these sources, however, is insufficient. 

    In Search Console, Google only discloses a subset of the links it is aware of. The other link providers rely on their own company's crawls, and crawling the whole Web is a huge operation for which they simply do not have the resources.  
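    The merge-and-deduplicate step described above can be sketched in a few lines of Python; the file names and CSV layout are hypothetical, assuming each tool's export has the linking URL in its first column:

```python
import csv

def merge_link_exports(paths):
    """Combine link exports from several tools into one deduplicated set."""
    links = set()
    for path in paths:
        with open(path, newline="") as f:
            for row in csv.reader(f):
                if row and row[0].strip():  # skip blank rows
                    links.add(row[0].strip())
    return links

# Usage (file names are hypothetical):
# all_links = merge_link_exports(["gsc.csv", "ahrefs.csv", "majestic.csv"])
# print(len(all_links))
```

    Because the result is a set, duplicate URLs reported by multiple tools are counted only once.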
     

     Cleaning Links using Tools.

     
     There are tools that may help speed up the process of removing problematic links by automating their detection. Remove'em (https://www.removeem.com/) and Link Detox (https://smart.linkresearchtools.com/new/link-detox) are two of the most popular. These tools may be able to assist you in identifying some of your bad links. 

    However, you should not depend only on these tools to complete your tasks. Each program has its own methodology for detecting problematic links, which may save you time when evaluating all of your links. 

    Keep in mind, however, that Google has spent over 15 years perfecting its algorithms for analyzing connections, and it is a major element of its business to do so efficiently, including identifying link spam. 

    Third-party technologies will fall short of Google's algorithm in terms of sophistication. 

    They can discover some of the problematic links, but not all of the ones that you'll need to fix. 

    You should evaluate all of the links: not only the ones that are flagged as dangerous, but also those that are merely questionable or even seemingly harmless. Use your own judgment rather than relying on the tools to decide what is good or harmful for you. 

    The Disavow Links Tool 

    You may disavow links using a tool provided by Google (http://bit.ly/disavow_links). The Disavow Links tool informs Google that you no longer want particular links to pass PageRank (or any other benefit). This provides a strategy for reducing the harmful effects of poor links pointing to your website. 

    Manual Actions (Penalties) on Google 

    There are two ways to lose traffic: Google algorithm changes and manual actions. Algorithm changes are not penalties and involve no human intervention, while manual penalties do. 

    While the specifics of what causes Google to undertake a manual review of a website aren't always clear, manual reviews tend to be triggered in a variety of ways. Note that although an algorithmic ranking adjustment may occur in certain situations, these are not regarded as "penalties" by Google. 

    The following is a list of the primary probable triggers:  

     

    Spam reports.

     
     Any user (even your rival) may report spam to Google (http://bit.ly/report webspam). While Google hasn't said how many of these complaints it receives daily, it probably gets a lot of them. Google reviews each report and undertakes a human assessment if it considers the report trustworthy (it may use an algorithmic verifier to decide this). 
     

    Review initiated by an algorithm. 

     
     While Google has never confirmed this method, it's probable that algorithms are used to prompt a human evaluation of a website. The idea is that Google employs algorithms to discover huge numbers of sites with conduct that is potentially harmful, but not severe enough for Google to punish algorithmically, so these sites are queued for human review. Google might also use custom algorithms to flag sites for evaluation. 
     

    Regular evaluations of search results. 

     
     Google has a big staff of employees who manually check search results in order to assess their quality. This project is mainly meant to offer feedback to Google's search quality team, which is used to improve its algorithms. However, it's feasible that this process may also be used to select specific sites for additional investigation. When a review is initiated, a human reviewer looks at a set of criteria to see whether a penalty is warranted. Whatever the result of the investigation, it's probable that Google will preserve the notes in a database for future reference. Google is expected to keep a record of all webmasters' past transgressions, whether or not they result in a penalty.  
     

    Google Penalties and Manual Actions.

     
     There are numerous different types of manual penalties. Thin-content and link-related penalties are the most well-known forms, but you may also incur a number of additional penalties. The following sections go through some of the most prevalent forms of manual penalties. 

    Google has two important sites that will help you understand the various sorts of penalties and what they mean: 





    The content of these two pages, which outline the sorts of activities that cause Google to have concerns about your site, is a crucial element of any SEO approach. 


    Here are some of the most typical penalties that websites can face: 


    Thin-content penalties. 


    This penalty is applied to pages that, in Google's judgment, do not provide enough value to users. Unfortunately, when you receive a penalty like this, Google doesn't provide much information about what caused it. It does inform you that you are being penalized for thin content, but the rest is up to you. 

    Thin-content penalties are triggered by four main factors: 

     

    Pages containing little or no valuable information. 

     
     Pages with very little content are possible causes for this penalty, as the name implies. This is particularly true if there are a lot of these pages or if there is a portion of the site where a substantial percentage of the pages are considered thin.

     

    Thin slicing. 

     
    This occurs when publishers create pages only for the purpose of attracting search traffic. These publishers often create a page for each possible search term a visitor may use, even when the content differences between the pages are minor or irrelevant. Publishers sometimes do this unintentionally by auto-generating content pages based on queries visitors type into the website's search feature. If you decide to implement anything like this, you'll need a thorough review process for weeding out these thin-slicing variants, as well as a single version of each page to concentrate on.  
     

    Doorway pages. 

     
     These are pages that seem to have been created only for the purpose of monetizing people who have arrived through search engines. These pages may be identified by the fact that they are frequently solitary pages with minimal follow-up material, and/or they are pages that are primarily produced for search engines rather than people. When a user lands on these sites, he or she has two options: purchase now or leave.  
     

    Inadequate integration with the rest of the site. 

     
     Another thing to check for is whether or not sections of your site are nicely integrated with the rest of it. 

    Is there a straightforward method for people to access these pages from the home page, the site's primary navigation, or at the very least a key portion of the site? 

    A thin-content penalty may be imposed if a section of your site appears to be disconnected from the rest of it. Once you think you have remedied these concerns, you must file a reconsideration request. 

    More information is available in the "Filing Reconsideration Requests" section below. After you've submitted your request, all you have to do now is wait for Google to respond. 

    Normally, this procedure takes two to three weeks. If you succeed, you're in excellent condition; all you have to do now is make sure you don't go overboard again in the future. 

    Otherwise, you'll have to go back to the drawing board to see what you may have overlooked. 

    Partial link penalties. 

    A partial link penalty is another potential manual action. As part of the warning you get from Google, this is commonly referred to as an "impacts links" penalty. These penalties mean that one or a few of your pages have been flagged for poor linking practices. Normally, this penalty has only a small impact on the rankings and traffic of those specific pages. 
     

    Link penalties that apply to the whole site. 

     
     Manual link penalties may be applied to the whole site. This typically suggests that more than a few pages are implicated, and it might even mean that the site's home page is affected. 

    The publisher's sitewide rankings are decreased as a result of this penalty. 

    As a result, the quantity of traffic lost is usually much more than with a partial link penalty.  
     



    Other Types of Manual Actions/Penalties.

       
     

    Cloaking and/or sneaky redirection.

     
     If Google thinks you're presenting different versions of pages to Googlebot than you are to users, you'll receive this notice. 

    To troubleshoot this, fetch the page using Search Console's URL Inspection tool, then load the same page in a separate browser window and compare the two versions. If you don't have access to Search Console, the Mobile-Friendly Test tool is the next best thing. 

    If you see disparities, put in the time and effort to find out how to get rid of them. 

    You should also look for URLs that redirect people to pages that aren't what they expected to see—for example, if they click on anchor text expecting to read an article about a topic they're interested in but instead land on a spammy page trying to sell them something. 

    Conditional redirects, where people who come via Google search or from a certain range of IP addresses are sent to different pages than other users, are another possible cause of this issue. 
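    One low-tech way to spot such differences yourself is to save the HTML served to a normal browser and the HTML served with a crawler user agent, then diff the two. A minimal sketch using Python's standard library (the HTML snippets are hypothetical):

```python
import difflib

# Hypothetical HTML saved from fetching the same URL twice:
# once with a normal browser user agent, once with a crawler user agent.
browser_html = ["<h1>Tennis Rackets</h1>", "<p>Our full catalog.</p>"]
crawler_html = ["<h1>Golf Courses</h1>", "<p>Our full catalog.</p>"]

# Any lines prefixed with - or + differ between the two versions.
diff = list(difflib.unified_diff(browser_html, crawler_html,
                                 fromfile="browser", tofile="crawler",
                                 lineterm=""))
for line in diff:
    print(line)
```

    Unexplained differences between the two versions are exactly what you'd then investigate and eliminate.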
     

    Keyword stuffing and/or hidden text. 

     
     This alert appears if Google suspects you of cramming keywords into your pages to manipulate search results; for example, if you place text on a page in white on a white background, it is invisible to humans but visible to search engines. Another way to earn this message is to simply keep repeating your page's core keyword in the hopes of affecting search results. 
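    As an illustration of the first technique (all values hypothetical), markup like the following is what a hidden-text action targets; it is shown here only as something to avoid:

```html
<!-- White text on a white background: invisible to visitors,
     readable by crawlers. Do not do this. -->
<div style="background-color: #ffffff;">
  <p style="color: #ffffff;">
    buy rugs cheap rugs best rugs rug sale rug deals rug store
  </p>
</div>
```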
     

    Spam created by users. 

     
     This penalty is imposed on websites that accept user-generated content (UGC) but are deemed to be performing a poor job of quality control on that material. 

    It's fairly typical for spammers to target sites with user-generated material by submitting low-quality content with links back to their own sites. 

    Identifying and removing the spammy pages is a short-term solution. The longer-term solution is to set up a system for analyzing and removing spammy material before it enters your site in the first place.  
     

    Unnatural links from your site. 

     
     This means Google thinks you're selling links to other parties or engaging in link schemes to pass PageRank. The solution is simple: either remove the links or add a nofollow attribute to any links on your site that appear to be paid. 
     

    Security issues.

     
    Google lists the security issues it has identified with your site in the Security Issues report in Search Console. Google will notify you of the penalty by giving you a notice in Search Console and/or by displaying warnings in the search results that your site has been hacked (and is unsafe to visit). 

    Failure to keep up with upgrades to your content management system (CMS) is the most prevalent source of this penalty. 

    Spammers use weaknesses in the CMS to alter your web pages, usually to insert links to their own sites, but sometimes for more sinister goals such as gaining access to credit card data or other personally identifying information. To fix the issue, you'll need to figure out how your website was hacked. 

    If you don't have any technical personnel on staff, you may need to seek assistance in detecting and correcting the issue. Keep your CMS updated to the most recent version available to limit your risk in the future.  
     

    Pure spam.

     
     If Google feels your site is utilizing particularly aggressive spam methods, it will display this alert in Search Console. This may include things like automatically created nonsense or other approaches that don't seem to be aimed at adding value to people. If you get this notice, you should probably shut down the site and start again. 

    Spammy Freehosts.


    If a substantial fraction of the sites using your hosting firm are spamming, Google may take action against all of the sites hosted there, even if your site is clean as a whistle. Make certain you're dealing with a reliable hosting firm. To resolve any of these issues, you must address the root cause of the complaints. Once you feel you have done so, follow the method indicated in the section "Filing Reconsideration Requests." 

     

    Diagnosing the Cause of a Traffic Loss

     
    Checking your analytics data to verify whether the decline is due to a loss of organic search engine traffic is the first step in determining the source of a traffic loss.

    If you have Google Analytics, Adobe Analytics, or another analytics package installed on your site, double-check your traffic sources and then isolate only the Google traffic to see if that's what's gone down. 

    Once you've confirmed that the decline is in Google organic search traffic, the next step is to see if you've received a notification in Google Search Console stating that you've been penalized. 

    If you've received one of these messages, you now know what the issue is and can work on remedying it. It's not nice to have a problem, but understanding what you're up against is the first step toward recovery. If you haven't received such a message, you'll have to dig a little deeper to figure out what's wrong. 

    The next step is to pinpoint the precise date when your traffic began to decline. There are a number of programs on the market that may be used to determine whether there were any important Google changes on that particular day. 
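    One simple way to pinpoint that date from an analytics export is to find the largest day-over-day percentage drop in organic sessions. A minimal sketch (the dates and numbers are hypothetical):

```python
# Hypothetical daily Google organic sessions exported from analytics.
sessions = {
    "2021-04-01": 5200,
    "2021-04-02": 5150,
    "2021-04-03": 5300,
    "2021-04-04": 3100,
    "2021-04-05": 2950,
}

dates = sorted(sessions)

def day_over_day_drop(i):
    # Fractional decline from the previous day (negative means growth).
    return 1 - sessions[dates[i]] / sessions[dates[i - 1]]

# The largest day-over-day drop marks the likely start of the loss.
drop_index = max(range(1, len(dates)), key=day_over_day_drop)
drop_date = dates[drop_index]
print(drop_date)  # → 2021-04-04
```

    With the start date in hand, you can compare it against the update-tracking tools listed below.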



    Here are some tools that you may use to do this: 

    Mozcast https://moz.com/mozcast/

    Moz's History of Google Algorithm Changes https://moz.com/google-algorithm-changes 

    RankRanger Rank Risk Index Tool https://www.rankranger.com/rank-risk-index/ 

    Accuranker 'Grump' Rating https://www.accuranker.com/grump 

    Algoroo https://algoroo.com/ 

    Cognitive SEO Signals https://cognitiveseo.com/signals/ 

    If you haven't received a warning from Google Search Console and the date of your traffic loss does not coincide with a known Google algorithm change, determining how to recover is significantly more difficult since you don't know what caused the decline. On a regular basis, Google makes minor modifications to its algorithms. 

    According to Google, these are minor tweaks rather than major updates. Even these, though, can have a major influence on your site's traffic, whether favorable or unfavorable. If they affect you negatively, such changes may be considerably more difficult to recover from. 

    Google makes daily modifications in part because it enables them to make tiny improvements on a regular basis while also running a range of tests to enhance the algorithm. 

    The breadth of these changes may sometimes reach a point where the industry notices them, and you can see lively conversations about what's going on on Twitter or in key search industry publications like Search Engine Land, Moz, Search Engine Journal, and others. 

    Google confirms some of these updates, while others go unconfirmed. Regardless, any of them may have a significant influence on your site's traffic. 
     

    Requesting Reconsideration for Manual Actions and Penalties. 

     
     Reconsideration requests apply only to manual penalties. You won't be able to file one to recover traffic losses unless you have a manual penalty. 

    The second thing to keep in mind regarding your reconsideration request is that it will be reviewed by someone, and that person will most likely be reviewing a big number of them every day. 

    Complaining about what has occurred to your company or being combative with the reviewer will not assist your case. 



    The greatest strategy is to keep it brief and sweet: 



    1. Describe the situation in a few words. If at all feasible, provide some statistics. 

    2. Describe the problem: for example, if you were unaware of the rules, just admit it and tell them that you have now learned them. If a rogue SEO agency did shoddy work for you, say so. 

    3. Describe how you resolved the issue: if you had a link penalty, tell them how many links you were able to remove. Tell them if you did anything dramatic, such as removing and/or disavowing all of your links from the previous year. Bold actions like these may make a big difference and increase your chances of succeeding. 

    4. Make it clear that you plan to follow the Webmaster Guidelines in the future. Keep your reconsideration request brief, as previously said: cover the important points briefly, and then submit it using the Search Console account associated with the penalized site. In fact, the request can only be submitted from an account associated with a site that has a manual penalty.  
     

    Timeline for requesting a reconsideration.

     
     After you've submitted your request, you'll have to wait. The good news is that you should hear back within two to three weeks. Hopefully, your efforts will be fruitful! If not, you'll have to start again from the beginning to find out what you missed. 
     
    Recovering from Traffic Losses That Weren't Caused by a Manual Action or Penalty. 
     
     Reconsideration requests are available only if you have received a manual penalty. For all other causes of lost traffic, all you can do is make the changes to your site that you think will help you recover, and wait. Google needs to crawl your site again to see what modifications you've made. Even if you've made enough adjustments, it might take Google many months to notice enough of the new or removed pages to tip the scales in your favor. 
     

    What if you don't make it back? 

     

    Unfortunately, if your results don't improve, it's likely that you haven't done enough to solve the problems that caused your traffic loss. 

    Don't rule out the potential that your development team made modifications that make it tough for Google to crawl your site. 

    Perhaps they changed the platform the site is built on, used JavaScript in a way that hides content from Google, blocked crawling of key content with robots.txt, or introduced some other technical problem. 
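    As a quick illustration of the robots.txt case, Python's standard-library urllib.robotparser can check whether a given rule set blocks Googlebot from a URL. The rules and example.com URLs below are hypothetical, a minimal sketch of the kind of audit you might run after a traffic drop:

```python
# Sanity check: has robots.txt accidentally blocked Googlebot from
# pages you need crawled? Uses only the Python standard library.
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt contents, e.g. left over from a staging site.
robots_txt = """\
User-agent: *
Disallow: /private/
Disallow: /blog/
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# "Disallow: /blog/" silently removes the entire blog from Google's
# crawl -- a common cause of an otherwise unexplained traffic loss.
print(parser.can_fetch("Googlebot", "https://example.com/blog/post-1"))      # False
print(parser.can_fetch("Googlebot", "https://example.com/products/widget"))  # True
```

    Running a check like this against every URL pattern that lost traffic is a fast way to rule the robots.txt explanation in or out before investing in deeper fixes.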

    If this isn't the case, you'll need to keep investing in the aspects of your site that you believe are linked to the traffic decline, or that will raise the value of your site more broadly. Take charge of the problem by committing to making your website one of the best on the Internet. This takes real imagination and vision. 

    To be honest, it's not something that everyone can do without major time and financial effort. 

    One thing is certain: you can't afford to take shortcuts when it comes to mitigating the effects of Google traffic losses. If you've put in a lot of time and made a lot of changes, but you still have content or other areas of the site that need improvement, chances are you haven't done enough. You could find yourself four months later wishing you had persevered with your recovery effort. 

    Furthermore, the Google algorithm is always changing. Even if you haven't seen a drop in traffic, Google's message is clear: sites that deliver exceptional content and excellent user experiences will be rewarded the most. 

    As a result, your best bet is to be enthusiastic about building a site that does both. That is how you improve your odds of recovering from traffic losses and of benefiting from, rather than being hurt by, future Google updates. 
     

    Ending Thoughts.

     
    Manual actions/penalties or algorithmic updates that result in traffic losses can have a big effect on your company. As a digital marketer, it is therefore vital to understand Google's ever-changing Webmaster Guidelines (http://bit.ly/webmaster_best_practices), to create appealing websites that meet the needs of the end user, and to promote those websites with legitimacy and longevity in mind.





    The Top Five Social Media Mistakes

               Here's a list of the biggest mistakes I've seen business owners make with social media and how to avoid them.

    1. Talking One-Way: Many business owners start posting status updates because they think that is all they need to do to grow their company online. But the way they do it cuts off any chance of a two-way conversation. In today's messaging marketplace, consumers want to be heard. If you are just talking at customers and not letting them talk back and engage with you, then you are wasting considerable time and effort online.

    When you go online and post in a status update area, do not just talk at people; speak with them. Tag people in a post and ask them a question. Tagging simply means that you write directly to a person on his or her Facebook wall or Twitter feed. On Facebook you put the "@" sign in front of their profile name. For Twitter, this sign would go in front of their username.


    Also, take a few minutes to stop by the "neighborhood" of each social site that you frequent and say hello, find out what your neighbors are up to and post a quick reply. By actively engaging in these spheres, you keep your business top of mind.

    2. Not Knowing When to Ask for Business: Many online businesses have conducted conversations with their connections for quite some time now, without translating this dialogue into any sales. Some companies fail to ask for business online or they ask for it too soon. You need to build some rapport first. People will buy from you only as much as they trust you. Set up a rule to convert conversation into clients or customers.

    I follow the 3/3 rule, whereby I speak with someone no more than three times, for not more than three minutes on each occasion, freely offering tips, exploring another company's branding or directly helping them, before I ask that person for some business. When I do the asking, I send the prospective customer a closing script or a post to indicate how I can help further.

    3. Shiny Object Syndrome: With all the flashy new websites and with social networking capabilities changing by the minute, no wonder you are swept up in checking out a new site or a fresh feature when you go online. But instead of spending countless hours exploring new dazzlers, devote only a set amount of time each day or week to review the new happenings online. Otherwise you will be sucked into a vortex of shiny objects and before you know it your week is over and you have not converted any online relationships into profits. Flag interesting sites or novel capabilities in a folder or on your calendar to revisit later for research and development.

    4. Poor Messaging: A consumer can become overwhelmed by dealing with all the wrong messages that are crowding the Internet lately. Company owners are projecting the wrong image through what they say online. In some cases, their posts have absolutely nothing to do with their company, brand or personality.

    Too many entrepreneurs do what I call panic posting -- just posting for the sake of posting and sharing ideas that do not highlight their overall brand image. If you have a serious company, don't post jokes and funny videos. Instead, post statistics and updates about your company's team members. If your business has a relaxed image, inject humor into your posts. A funny YouTube video can go a long way.


    5. Sales Faux Pas: Writing how much your products or services cost in a status update or post is not only a time waste, it is plain wrong. Would you walk up to someone before you have even introduced yourself and say that your latest product is now available at a certain price for a limited time? If so, you would probably end up not only talking to yourself (the person would walk away), but also you likely would lose the entire room of people as customers.

    Try sharing the pros and cons about your industry or product category and ask people to provide feedback and participate. This is one way to bridge the distance between you and your prospects and get them involved with your company's brand. Ultimately newfound fans will promote you without being asked because they feel included. The fact that you asked and listened goes a long way.

    Whether yours is a one-person business or it has 150 employees, take time every month or quarter to examine your social media practices. You could save thousands of dollars and hours -- and have more to show for it.

    Five Truths About Social Media Marketing

    Size matters. It just does. Much like the size of a company's email list has obvious importance to a brand, so does the distribution a brand has on Facebook. Of course, quality of userbase is of utmost importance. Having a large, engaged group of self-identified "fans" or "followers" on Facebook represents a highly valuable distribution channel. Take, for example, American Express. They have over 2 million Facebook fans, or 2 million people to whom they can deliver customer service, notify about new offers and engage with on a recurring basis.

    Ultimately, social media is about sharing, and sharing into a vacuum is useless. The more people signing up to view your message, the more likely you'll be able to effectively cultivate and monetize these relationships.

    The medium is the message. The medium is completely tied to the message in social media -- the two are inextricably linked. This isn't an issue of substituting technology in place of relevant brand messaging. Rather, this amazing "new media" (we'll get to that point later) has given brands and marketers an opportunity to position their products and messaging in a unique way. The best brands are doing a phenomenal job of seamlessly integrating the two, and the best and largest platform, Facebook, is working tirelessly to empower brands in every way possible. (Check out facebook.com/marketing, facebook-studio.com, and Facebook communities like Clinique, Starbucks, Audi and American Express.)

    Social media gurus really do exist. They certainly do. I'd qualify many of the talented social-media marketers and Facebook employees I've interacted with as social media gurus. And if you need names, consider Gary Vaynerchuk from Wine Library TV and Nick O'Neill from AllFacebook.com. If you're suggesting that too many people are trying to own the title of social media guru, then I can agree with that. However, there are incredibly bright people innovating within social media. Consider these folks; they're gurus and worth engaging with.

    Social media is 'new' media. Yes, textbook-marketing principles (the 4 P's, Porter's 5 Forces, etc.) are still the backbone of brand marketing, and still hold significant weight today -- as they should. However, the past few years have proven that certain traditional forms of marketing and advertising are yielding way to this wild and crazy "new media" (see the magazine, newspaper and radio industries for more info.) The best social-media marketers are expertly displaying the basics of marketing and their corporate goals within this "new media" -- be it with likes, hashtags or check-ins.

    Social media can be effectively outsourced to a PR firm. If you want to qualify that statement by saying that hiring a PR firm doesn't necessarily equal social-media success, then I would agree. However, there are many PR firms and social-media agencies that consistently make sure they understand a client's values and goals before publishing to the social-media ecosystem on their behalf. We really like what Rockfish Interactive is doing for Bicycle Playing Cards, for instance. Rockfish, a digital-marketing agency that's based in Rogers, Ark., recently helped the 143-year-old playing-card maker relaunch its online social presence on Facebook, Twitter and YouTube. These firms are doing amazing things to harness the power of social media for their clients.

    How to Build a Winning Brand?






      Of all the startup brands, Starbucks still represents the gold standard.

      Starbucks made the mundane act of buying a cup of coffee into an experience. 

      It did so by creating a memorable brand: a unique name and a memorable logo that made coffee not just coffee, but a welcoming, comfortable place to go and be seen.

      The Starbucks brand created a culture. 

      Here's a look at how yours can do the same.

      Step 1: Craft your image

      Creating a brand perception requires intrusion. You are trying to position yourself with people who don't want to change their purchasing decisions. 

      Your brand must be powerful enough to force them out of their routines.

      It all starts with a name. With enough frequency of the message, any name can become memorable. That could be a name that explains, like Jiffy Lube or Toys"R"Us. 

      Maybe it's a made-up word or obscure reference, but one with the power to create a lasting emotional connection (think Starbucks again). 

      Obscure brand names are unique from their competition and often become among the most memorable. It could also be a family name, which implies the person behind the brand name has a credibility to be in this business, a pride of workmanship and a moral standard.




      Your logo is just as important as your name. 


      The logo is the first visceral connection the consumer makes with the brand. It triggers the brand perception. 

      The first measure of a logo is that it answers questions: 


      1. Who are you? 
      2. What do you do? 
      3. What's in it for me?

      There are other practical considerations in logo design:


      1. It must reproduce well in various sizes and media.
      2. It should reflect the sensibilities of the target audience.
      3. Its intention and message should be perfectly clear.
      4. It should be easily and uniquely recognizable.

      At its best, a logo should convey an emotional connection as well as personality. The cleverness in a conceptual logo should get a reaction--an "aha!"--while conveying what you do and capturing the personality of your business.

      Step 2: Get Known

      Branding happens in the minds of consumers. The promises behind the brand create its appeal, but getting the word out is still what brings in the customers.




      Traditional media exposure--advertising, promotion, trade shows, direct marketing, events, directories and even search-engine marketing--costs money, and most startups don't have much. Social media is a great equalizer for the cash-strapped entrepreneur. 

      Here are some fundamental guidelines for building your brand online effectively using Twitter, Facebook, YouTube, blogs and other social media outlets:

      1. Listen, don't just talk. The days of saying anything that comes to mind or reporting what you're having for dinner are over. Hear the conversation first, then participate.
      2. Ask, don't tell. The goal is developing an exchange. Force your opinion and you'll end conversations before they begin.
      3. Be real, and have a story. Behave in the character of the brand. Give the character depth and be genuine.
      4. Be interesting, and give. Add to the conversation by offering up whatever knowledge you have.
      5. Be interested, and respond. Hear a person's need, then share expertise in a personal way that has no motivation other than to help.
      6. Have a payoff, and say thank you. Reward your followers with something special and exclusive. Appreciate them for following your brand and letting you into their world.

      Step 3: Know What the Customer Wants




      In launching a business with limited funding, the potential for successfully establishing a brand is far too often based on the zeal of the entrepreneur's belief in the disruptiveness of the unique business idea rather than market intelligence. That doesn't usually work.

      To increase your brand's chances for success, you need to know five things:


      1. How strong is the perception of your brand, and what would make it stronger?
      2. What is the true level of consumer satisfaction for competitor brands?
      3. Will your brand introduce emotional connections that do not currently exist in the market segment?
      4. What percent of the market will consider change because of the disruptiveness of your product?
      5. How much awareness can you gain for the brand?

      Brands that Delight


      One of the observations of market research firm Brand Keys, which compiles the annual Customer Loyalty Engagement Index, is that "delight is the new differentiator." 

      Here are the index's top 10 brands for evoking customer delight:


      1. Netflix
      2. Apple
      3. Walgreens
      4. Discover
      5. Hyundai
      6. Mary Kay
      7. McDonald's
      8. J. Crew
      9. Samsung
      10. Nikon

      The answers to those five questions will determine your chances for successfully branding your product. There are various methods for conducting consumer research, like focus groups and e-mail surveys, that determine what would make a consumer recommend your brand to a friend. If cost is a crippling concern, you must at least go out into the market, observe consumer behavior over a relevant period of time and keep tallies of each type of consumer behavior.

      To succeed, you need to know what is the true perception of your brand, how many people hate it, how many it appeals to strongly enough that they would advocate for it and how that acceptance stacks up against the competition. The most successful companies pick a competitive position from which they know their brands can win.


      Brand Win: Go Daddy


      Branding experts groan every time they see a Go Daddy Girl in a tight white tank top appear on their TV screens, which happens between 500 and 900 times a week on cable. The company's edgy branding strategy has little to do with the very unsexy business of domain registration and website hosting, and Go Daddy acknowledges that its suggestive marketing alienates part of its potential customer base.

      But it's hard to argue with the results. Before its first Super Bowl commercial, the company had 16 percent of the new domain name market. After Go Daddy aired that ad, a spoof of Janet Jackson's wardrobe malfunction at the previous Super Bowl, its share jumped to 25 percent within weeks. Six years later, Go Daddy owns more than 50 percent of the market of new domain names, and the company is a household name--even if a big chunk of people still don't know exactly what it does.

      "It sounds so simple, but if something works, I keep doing it, and when it doesn't work, I stop," says Bob Parsons, Go Daddy's founder and CEO, and the brains behind the brand strategy. "The edgier the brand is, the better it works. The point is to keep it fun."

      In reality, though, Go Daddy's branding--including the unusual name and the child-like logo of a man with sunglasses and a star on his head--is classic advertising. It creates curiosity and promotes name recognition, something most tech services have never done well. But what really defines the company's success is what customers discover once they are enticed to learn more.

      "None of this stuff with branding is going to work if you can't deliver," says Parsons, noting that of the 3,000 people on the payroll, 1,800 work in customer service. "We provide the best service of anyone in the world. We even call customers to thank them for $10 purchases."

      Go Daddy has tried other advertising routes, including one appealing to busy moms and another touting its U.S.-based call centers. Neither pushed people to the site like the edgier Go Daddy Girl commercials, which this year feature NASCAR's Danica Patrick and fitness guru Jillian Michaels. Parsons says when the provocative advertising stops working, he'll try something different. Until then, his girls will keep teasing NFL fans and late-night cable watchers.

      "We've taken domains and websites, which is about as exciting as a cup of sawdust," Parsons says, "and made people pay attention."

      Brand Fail: Fit Fuel


      Major magazines were featuring Sean Kelly and Luke Burgis as two of America's entrepreneurial wunderkinds. The duo's business, Fit Fuel, was on the fast track to becoming the next big online retailer. But somewhere between the accolades and the fast growth, the company lost its way, and the next big brand in fitness is no longer even a brand. A big piece of the downfall was bad branding.

      Fit Fuel was conceived as a service to help vending machine companies stock their slots with healthy choices instead of chips and soda. But soon after launch, Kelly and Burgis realized the majority of their customers were not other businesses, but regular Joes looking for good prices on PowerBars and trail mix. They ran with it, reshaping the company into a fitness product e-tailer. Growth was exponential, bringing in $5 million per year at its height, and they moved to a giant warehouse in Las Vegas and jumped from five employees to 20.

      That ramp-up proved to be the downfall. Fit Fuel was shifting from selling nutrition products to stocking all things fitness, including books, exercise equipment, apparel and sexual enhancement products. "We were like the Amazon.com of fitness," Burgis says. "But people were confused about who we were, and we didn't have capital. We couldn't survive a price war."

      Sean Kelly left Fit Fuel to focus on the healthy vending machine concept, and Burgis was left to figure out the company's direction. He started modeling his business on shoe e-tailer Zappos, thinking a hip, service-oriented company culture could define his brand. Burgis renegotiated his contracts so they could offer two-day shipping. Customer service was impeccable. But it didn't resonate with shoppers.

      "It was what I wanted the company to be, not what customers wanted," he says. "We put our marketing energy into the wrong things, and we were carrying way too many products."

      After failed takeover negotiations with Zappos the year before, Burgis declared bankruptcy, and Fit Fuel was shuttered. But the experience didn't go to waste. Now at his new venture ActivPrayer, a company that trains coaches to run Christian-focused group fitness classes, Burgis is very keen on what his customers want, and the brand lines up with his strengths.

      "Customers are very interested in who we are and in how we do business," Burgis says. "At Fit Fuel, they just wanted their product in a reasonable amount of time. They didn't care if I was a good guy or not."


      Five Reasons Why Websites Still Matter

                 You know you must leverage Facebook, Twitter and word-of-mouth marketing to increase awareness of your brand. But the fact is, websites remain infinitely more popular with consumers than all of the business pages on social media sites combined.

      Only 22 percent of those of us online in the U.S. visit a branded social networking page such as those found on Facebook, while 62 percent of us regularly visit branded websites, according to the latest Global Web Index report. If you were starting to let your site become outdated or haggard, consider a refresh. After all, as these figures note, websites still matter.

      Here are five reasons why you shouldn't ignore yours:

      1. Branding: Since it's your site, you set the design, which affords you the flexibility to optimize the user experience in ways that directly support your business model and brand-related goals. There's no competition on your website, just a branded experience that you direct yourself.

      2. IT and Engineering Jurisdiction: When you control your own site, you have complete jurisdiction over its code, hosting environment, page count, content, plug-ins and more. Just as I mentioned above with regard to branding -- here too you have the elasticity required to make small or sweeping adjustments at will, an advantage you don't get with third-party websites. With sites like Facebook, you can change minor graphics and some content but not code, navigation scheme, server speed or the graphic user interface.

      3. Content: Speaking of content, more of it can be found on your own website than on a third-party utility or platform, and none of it competes side-by-side for your visitor's attention. Create compelling and useful content that speaks to why someone is visiting your site and you stand a higher chance of that visitor taking action with respect to your products or services. And since inventory (i.e., web pages) is virtually unlimited on a site under your control, you have ample opportunity to add additional content and calls-to-action in the format you deem most appropriate.

      4. Search Engine Optimization (SEO): If garnering multiple, relevant and highly positioned placements in the SERPs (search engine result pages) is part of your sales and marketing strategy, a website is a must. When properly coded and managed, your site delivers natural and sustaining search results that drive qualified traffic to the exact pages on your site where you want visitors to be.

      5. Analytics: While many social utilities, platforms and networks provide access to data related to demographics associated with who accesses your profile and how often they do so, website analytic tools go much deeper. They can provide you with the type of business intelligence you need to determine in real-time how your online marketing performs and stacks up against the competition.

      Don't think for a moment that I'm suggesting you drop social in favor of your own website. What I'm advocating is that you lead first with your website, followed by leveraging social, email marketing, point of purchase, mobile, apps and other forms of marketing and outreach to drive traffic to your website where you can generate qualified leads who convert to paying customers.

      Pivot or Persevere? The Key to Startup Success





      The decision of when to pivot and when to persevere is the primary issue every entrepreneur ultimately encounters in creating a successful product. 

      A pivot is a structured course correction designed to test a new fundamental hypothesis about the product, business strategy, and engine of growth. 

      Entrepreneurs must regularly ask themselves an apparently straightforward question: 


      Do we need to make a significant adjustment or are we making enough progress to assume that our initial strategic premise is true?


      The erroneous choice to persist is the biggest destructor of creative potential. Companies that are unwilling to change course in response to market input risk being stranded in the land of the living dead, where they consume resources, the dedication of workers, and other stakeholders without making any progress.


      It's not about producing additional features or widgets to increase startup productivity. 

      It involves directing our efforts toward a company and a product that are actively trying to provide value and spur growth. 

      In other words, effective pivots lead us in the direction of expanding a long-lasting company.



      Failure is a necessary component of learning. 


      The problem with releasing a product and then waiting to see what occurs is that you are guaranteed to succeed: you will see what happens. 

      But what then? Once you have a few customers, you'll probably have five ideas about what to do next. Which one should you listen to? 

      In order to pivot, we must keep one foot firmly planted in what we have learned so far while fundamentally changing our strategy to pursue even greater validated learning.


      The majority of business owners who have chosen to pivot will tell you that they regret not acting sooner. There are three explanations for why this occurs.


      Vanity metrics can let entrepreneurs draw erroneous conclusions and live in their own private universe.

      It's almost impossible for an entrepreneur to really fail while the hypothesis remains vague, and without failure there is often little motivation to make the drastic changes a pivot requires.

      Many business owners are simply afraid. Acknowledging failure can lead to dangerously low morale. 

      The greatest fear of most business owners is not that their vision will turn out to be wrong. Even more terrifying is the possibility that the vision would be rejected before it had a chance to stand on its own.


      Entrepreneurs must confront their anxieties and be prepared to fail—often in front of others. 


      In reality, an extreme form of this issue affects business owners who have a high profile, whether due to their own notoriety or the fact that they work for a well-known company.

      A clear-eyed, objective perspective is necessary for making the pivot decision. For any startup the decision is usually emotionally charged, so it has to be approached in a methodical way. 

      Scheduling the meeting in advance is one way to address this. Every company, in my opinion, should hold regular "pivot or persevere" meetings. 

      Less than a few weeks between meetings is too frequent, and more than a few months is too infrequent. Every new business must find its own pace.

      From Grad Student to Social Media Millionaire

               Skeptics say social media hasn't existed long enough to produce experts. Clearly, those folks haven't met Shama Kabani. The 26-year-old wrote her master's thesis for the University of Texas at Austin about Twitter--when it had only 2,000 users, not the 175 million it has today. She hosts a web TV show about technology. Her 2010 book, The Zen of Social Media Marketing: An Easier Way to Build Credibility, Generate Buzz and Increase Revenue, is the No. 4 seller about web marketing on Amazon.com.


      And those are just the side projects. In 2009, at 24, Kabani founded The Marketing Zen Group, a social media marketing firm in Dallas. The company, which she launched with $1,500 of her own money, specializes in all aspects of web marketing for clients--from Facebook and Twitter to blogs and video.

      "We are in an age where people are tired of the faceless corporate culture," the Texas native says. "Every day, I ask myself the same question: What can I do today to increase value for our trusted audience (blog readers, TV watchers, Twitter followers, etc.), for our team and for our clients?"

      That value means different things for different clients. For k9cuisine.com, a Paris, Ill.-based online retailer of dog food, Marketing Zen established blog and Twitter presences and cultivated relationships with pet-related bloggers. For Arthur Murray Dance Studios in Boston, the company optimized a website to generate more targeted sales leads.

      Marketing Zen wasn't always about social media. Kabani says her original plan was to start a general consulting agency. But she quickly realized her passions--and all the best gigs, for that matter--were in the social media space, so she tweaked her strategy.

      This modified approach was about consulting with clients, telling them how to better market their businesses online and letting them run with it. Meanwhile, Kabani learned two things: Clients wanted not just a consultant, but also a firm that could implement ideas; and social media is only part of a larger marketing puzzle that includes building solid websites and developing smart search engine optimization.

      "That is how we went from being a consulting company to a company that takes over web marketing for our clients," she says. "Our value proposition is simpler now: We drive inbound leads for our clients and increase their online brand visibility."


      That thinking appears to be working. Kabani declines to share specifics, but she notes revenues grew more than 400 percent last year alone, and she expects Marketing Zen to be a "multimillion-dollar company" by the end of 2014.

      One of the secrets to Kabani's profit model is low overhead. She hired almost all her 30 employees virtually, and many key people work remotely. At least a dozen Marketing Zen employees are in the Philippines.

      Another key differentiator: legitimate engagement. Kabani prides herself on having her clients engage with followers, rather than simply talking at them. For Dave Kerpen, CEO and co-founder of competitor Likeable Media, this is a distinguishing characteristic. "She understands the true meaning of community engagement," Kerpen says. "She gets that the conversation must go both ways in order to satisfy customers."

      Kabani is also giving back: Earlier this year, she was part of a delegation of businesspeople from the U.S. and Denmark that traveled to Egypt to educate young entrepreneurs. It's all part of her far-reaching quest for enterprising youth to tackle the new global business market.

      "The world we live in today is not the world of our grandparents, or even that of our parents," Kabani says. "A college degree does not guarantee success. Young entrepreneurs have to create their own opportunities. The economy needs fresh blood and bold new ideas."

      Why You Don't Want to Be the Low-Cost Leader

                 When you select a pricing strategy--that is, decide how you wish to price your products or services--what is your goal? The first answer that comes to mind may be to maximize profits, but that isn't a good enough answer.

      Think about it this way: When your company develops new products or invests in a new marketing campaign, what's the goal? To maximize profits. But that doesn't tell you what types of products to develop or which customers to target or what message to deliver.

      Both Ikea and Mercedes want to maximize profits--and they use very different pricing strategies to do so--but we don't think of Ikea and Mercedes in terms of their pricing strategies. We think of them in terms of their products and positioning. Ikea is a fun, designer, starter furniture store; Mercedes is a luxury automobile manufacturer.

      Both companies set their pricing strategies to be consistent with their overall goals and the vision of who they are. Price follows their corporate strategy--not the other way around.

      What is your overall strategy? It's the general description of how you compete in the market. It is your sustainable competitive advantage. Your strategy should be based on how your product or service differs from your competition, from product features or location to marketing or the breadth or focus of your offering. It can be many things, but it shouldn't be price.

      Related: How Pricing Can Power a Turnaround

      Why not? Because pricing is not a sustainable competitive advantage. Prices can change almost instantly. Your competitor can change prices just as quickly as you can. What if you find that optimal price, that psychologically perfect price that magically makes all customers want to buy from you? Your competitors will copy it--immediately. Any competitive advantage you may gain with pricing is not sustainable.

      The one time that pricing can be a corporate strategy is when the company is positioned as the low-price leader. That's Walmart. If you adopt low price as your strategy, then your business must be continually focused on lowering and controlling costs--like Walmart. You are attracting the price buyers, customers who are not loyal, but are looking for the lowest price. Once a competitor figures out how to sell a similar product for less, they will charge lower prices and you will struggle. If another company figures out how to sell products for less than Walmart, Walmart will be in trouble. Knowing this, Walmart maintains a laser-sharp focus on keeping costs down. If you make low price your strategy, you have to be like Walmart, continuously lowering your costs so your competitors don't catch up.

      You may be thinking about a different price-based strategy. "My product is as good as a Lexus, but less expensive. I'm going to make that my strategy." Don't do it. You may be able to have that product positioning for a short while, but it's not sustainable. The market will morph, and your position may or may not exist in a few years. You have competitors on both sides of you, above and below, either of which may be able to steal your position, because your position is just price.

      Related: Five Signs It's Time to Change Your Prices

      Consider Walmart's discount retail competition. Kmart is having a difficult time competing with Walmart. Same-store sales continue to decline even as the economy comes out of the 2010 recession. On the other hand, Target's same-store sales figures are growing rapidly. What's the difference? Although there are many factors, one is that Target has a unique positioning. It is described as "trendy," "cool" and "a hip discounter." Kmart may have the Martha Stewart brand, but the company as a whole doesn't own a position. There doesn't seem to be any real differentiation between Kmart and Walmart--other than price, which Walmart wins.

      Target's success isn't based on price. The company could not beat Walmart in a low-price battle. Target's success comes from owning the unique positioning of "hip discounter." There is only room for one company with the lowest prices, and that company is Walmart, at least for now.

      The strategy of low-cost leader is a rough-and-tumble position. Everything is done without frills. Once you get too comfortable, someone else hungrier than you will do it with less and steal your position. This is not a fun position to defend.

      Even companies that aren't low-cost leaders must still focus some of their energy and resources on costs. Target, Kmart and every company in a competitive situation still win and lose customers based on their prices. And to have competitive prices, they must maintain relatively low costs. Price is a factor in every customer's decision, and if one company's costs are much higher than another's, it runs the risk of losing on price.

      Richard Branson on Why Biggest Doesn't Mean Best

                 Editor's Note: Entrepreneur Richard Branson regularly shares his business experience and advice with readers. What follows is the latest edited round of insightful responses. Ask him a question and your query might be the inspiration for a future column.

      Q: Do you deliberately pitch the Virgin brand as good value? If your prices are competitive, is it still possible for the public to regard your products as "the best in the world"?
      -- Bobby Hall, Australia

      A: Not only can a small company be the best, but it has to be the best to stand a good chance of thriving in today's competitive world. Then, once it reaches the top spot, it has to strive to do better every day, to ensure customers buy its products or services. Large scale can bring a company many advantages: a hefty marketing budget, established brand awareness in target markets and dependable distribution networks. But, luckily for the smaller players, a business's size does not guarantee better products or great service.

      Twenty-seven years ago, when Virgin Atlantic had just one secondhand 747, we were able to compete with British Airways, which had a large fleet, a massive marketing budget and the dominant position at Heathrow, the U.K.'s leading airport.

      Related: Richard Branson on Finding and Selecting Investors for Your Startup

      We set out to be the cheapest on the block, but only in our specialized niche. I had learned from the collapse of Laker Airways, the British budget airline, that competing on price alone was not a good strategy in the aviation industry. Since its profit margin was meager, the company was vulnerable to larger competitors' price attacks. Instead, we provided "first-class service at a business-class price," which allowed us to generate the profit margin we needed to continue investing in the business while earning a reasonable return.

      This strategy made a virtue of our airline's small size and modest start, shaping a very strong culture based on great customer service and our not being afraid to try new things. Virgin Atlantic's size was an asset: We were nimble and could innovate quickly, whether we were introducing new entertainment systems, better food, chauffeur-driven cars or our quirky lounges. There was no bureaucracy to slow us down, so we could direct money and resources to the right areas quickly and effectively.

      Our small size also meant that we built close relationships with our customers. A business's key asset in this area (size brings no benefit) is its people. Back then, many great people applied to work for us because we were a small group and had more fun. They were so important to our success that we quickly learned to focus on staff retention, and this paid off. Our employees knew from experience how to provide the best possible service.

      Without a large frequent-flyer scheme or network on our side, we learned to rely on our two strengths: customer service and personality. We drew attention to our unique offering through our famously cheeky ads, which were topical, timely and often poked fun at our competitors. This attracted notoriety and generated brand recognition.

      Overall, we made sure we provided great value for money, building a loyal base of customers who identified with the Virgin brand. Our company soon won market share from British Airways; in essence, we had used a small business's budget to create a big brand. And in the long run, Virgin Atlantic has become one of the strongest brands in aviation.

      Related: Richard Branson on Managing Experienced Workers

      We applied what we had learned when we started up Virgin Blue in Australia and Virgin America in San Francisco. Virgin's culture was now firmly focused on developing products and services from the customer's point of view and trying to do things better than anyone had done them before. We always try to shake up an industry by showing just how high standards ought to be.

      Virgin America will never be the same size as American Airlines or United Airlines -- two of the biggest players in the industry -- but we can out-maneuver and out-think them. The onboard experience we provide to our customers is very different from that of the legacy carriers: we provide great entertainment, free wireless Internet service and great food.

      Since its inaugural flight in August 2007, Virgin America has won a number of awards for service and quality. During Virgin Atlantic's first years, we would have been obliged to launch a massive newspaper, television and billboard campaign to draw attention to these achievements, but advertising has changed a great deal over the past decade. Now we can use the power of social media -- with the help of clever viral campaigns, e-mail communications and online advertising -- to generate support, coverage and sales.

      This change means that a company's large size is no longer a guarantee of its continuing success. With the playing field more level, big brands can no longer rely merely on expensive marketing campaigns to generate sales. Smaller players can build their global presence by using social media and word-of-mouth to promote their services without spending a lot of money.

      This has helped smaller players to punch above their weight. Biggest does not mean best, and it never has. Now, even if they don't have the biggest wallets, small companies can achieve recognition as the best in the world.

      Ways to Build Online Traffic and Boost SEO

                 "If you build it, they will come." It worked in the movies, but just putting up a website is no guarantee that it will draw traffic. You could buy ads, but if you're unable or unwilling to "pay to play," you're likely facing the increasingly daunting challenge of attracting customers to your website on your own.

      Millions of sites are competing for users' shrinking time and attention. The hard truth is that the top three unpaid positions on the first page of Google search results receive about 58% of all clicks, according to online-marketing service Optify. Websites that appear on the second page? An average of 1.5%.

      Your best bet for attracting potential customers without spending money on advertising is creating content of such value that audiences can't help but feel compelled to seek it out or pass it along -- on a site that makes sharing it easy.

      But how? Here are ideas for getting started publishing content that can help you increase online traffic, improve search-engine optimization and possibly go viral in the process.

      Original Articles -- Time investment aside, online articles cost little, can be easy to generate, and provide a way to brand company representatives as experts. Your articles could offer instructional learning, new methods for tackling problems or insight into best practices or industry leaders' views. Here are just a few article formats to consider:

      Essays
      How-to articles
      Tip sheets
      Checklists
      Guidebooks
      Interviews

      Grab readers with arresting headlines, bold statements and an authoritative voice. Use humor and catchy hooks (for example, "5 Ways to Torture and Infuriate Your Employees" or "The Wrong Way to Do Downsizing").

      For maximum impact, build content around popular search-engine keywords, dilemmas and industry memes. Avoid terms or references that might date your articles.

      Keep in mind that less can be more. Be succinct and summarize wherever possible.

      Videos and Podcasts -- Barriers to entering the online "broadcasting" business are lower than ever. Armed with a portable digital video camera ($100-$600), a USB microphone ($20-$200) and a spare hour, nearly anyone can create compelling short or extended-length shows. Videos are best kept under three minutes, and audio recordings to between five and 15 minutes. You'd be amazed how quickly content can be built and distributed. Even a simple webcam can provide a ready vehicle for recording.

      Here are a few ideas for videos or podcasts:

      Behind-the-scenes footage from your office
      Making-of style documentaries
      Product demonstrations
      Customer testimonials
      Webinars
      Q&As
      Panel discussions
      Uniquely-branded video programs ("Engineering 101")
      Custom training segments ("Launch Your Leadership Career")
      Exclusive sit-down interviews ("A Conversation with Seth Godin")

      All should be stamped with your logo, prominently feature business contact information and be promoted on aggregators like YouTube, Vimeo and Metacafe. It's also vital that users be allowed to pass along links and embed them on their own websites, as well as access and download audio recordings through popular online distribution services such as iTunes and Podcast.com.

      Blogs, Forums and Online Communities -- Your company wouldn't be in business if it didn't employ subject-matter experts in your field. A simple way to build trust, cement credibility and grow both reach and renown is to allow customers ready access to these individuals and their hard-won knowledge. Similarly, courtesy of their own education and experiences, customers may have additional insights, input and suggestions that they're happy to share with colleagues and peers. Tap into a wellspring of ideas, and prospective publishing material, by providing blogs, newsgroups, communities, message boards, polls and other forums where ideas are readily exchanged.

      Such solutions can foster creativity and discussion, provide enhanced user support and allow prospective partners or buyers to communicate with and grow trust in you and your team. They also can offer a two-way channel for conversation that helps you get to better know your customers, understand their needs and stay on top of new trends or interests. The activity and continuously updated content can keep users regularly clicking on your website.

      Charts, Diagrams and Infographics -- Computer generated or hand-drawn, these visually rich pieces can make data easy for anyone to understand and digest at a glance. And fun to share. Need ideas? Consider illustrating customer preferences, buying habits or population distributions.

      Fun facts and especially interesting or one-of-a-kind information won't just draw audiences' attention. They may also provide a ready platform for publicity that can lead to newspaper, magazine, radio and TV coverage.

      Books and Online Guides -- Everyone has a problem that needs solving: That's why businesses exist. Providing expert advice, hints, strategies and answers to perennial questions is an excellent way to establish yourself as a leading industry source and even gain media exposure. In addition to ebooks, you could consider publishing your work in PDFs or sharable online slideshow presentations.

      While you may opt to charge for full or more detailed manuscripts, at least a small initial installment should always be given away free. Just be certain to include information of value. Customers won't want to pass along a glorified sales pitch.

      PostSecret's social network of secrets

                As part of an art project, Frank Warren posted his home address online and asked people to anonymously mail him their secrets on handmade postcards. His idea: post those secrets online, giving people an outlet to say to the world what's on their minds.

      "They can make you laugh, they can break your heart -- but you think when you read all 20 to 25 every Sunday," says Warren, who launched PostSecret six years ago. "It leaves you someplace a little different emotionally than where you were."

      PostSecret's confessions range from everyday exasperations with humorous twists -- like a coffee barista threatening to serve demanding customers decaf -- to disclosures of life-threatening problems such as eating disorders or suicidal thoughts.

      PostSecret isn't just an art project these days. It's now the cornerstone of a business franchise, encompassing a book series, speaking engagements, and a new mobile app that launched last week.

      "I was a small business owner for 20 years," Warren says. "I feel as though not having a background in art but rather in business was helpful for me to, when this website became very popular, grow it and develop it in a way that it could self-sustain itself and find these other platforms of expression."

      Warren, who says the site sees more than 4 million hits a month, attributes much of PostSecret's popularity and longevity to his commitment to the project's core value: allowing people to share their secrets without exploiting them. That means no ads.

      "If we did have ads, we could generate a pretty good revenue stream," he says. "But I feel one of the reasons so many people -- over half a million -- have trusted me with their secrets is because they know that their secrets won't be exploited or commercialized."

      But Warren's newest venture -- an iPhone app that debuted last week -- carries a price tag. The $1.99 app allows users to read and share secrets on the go. The app adds a location layer to PostSecret's confessional network: Users can "pin" an anonymous secret to a location.

      The app has already cracked Apple's top-20 list of bestselling paid applications. More than 100,000 mobile secrets have been shared, says Warren, who is working on an Android app next.

      Like many startup ventures, PostSecret's mobile move is an evolving experiment.

      "It's super organic, so we don't know what kind of conversations are going to emerge," Warren says. "We don't know how people are going to use it."

      He envisions the app as an alternative social network -- "a way of shining light on these hidden parts of ourselves, and in some ways sharing secrets as commerce and currency."

      Some of those secrets are also cries for help. From this week's batch of postcards: "I'm considering death as a solution to the problems that will emerge when my unemployment runs out. All of the people in my life who think I'm handling this so well will realize how wrong they were."

      Warren -- who spent several years answering phone calls overnight at a suicide prevention hotline -- is exploring ways to use PostSecret's growing community for advocacy and aid. The app could help shine a light on data clusters that would otherwise stay hidden, he suggests.

      "We notice at a certain campus, perhaps, a lot of students are struggling with issues of abuse or eating disorder or stress ... there's a way we can talk to that school and have them offer more of their resources to students, or make them aware of what's available to help," he says.

      But even for the solipsistic, PostSecret can be an illuminating mirror.

      "One of the things I've learned form this project are there are two kinds of secrets. There are the ones we keep from others, and the secrets we hide from ourselves," Warren says. "Sometimes the more exposure you have to other peoples authentic secrets, the more you're able to look inside yourself and understand some parts of your life you need to deal with."

      Amazon: No California sales tax collection til 2012

                 Amazon has struck a deal with lawmakers that will give the company one more year before it must start collecting sales tax on purchases made in California.

      As part of the agreement, which was reached late Wednesday, Amazon has promised to back off fighting a new state law that would have compelled it to collect the tax.

      Now Amazon has until July 31, 2012, to lobby Congress for a federal solution.

      Right now, online retailers face a patchwork of local laws, requiring them to collect sales tax in some states but letting them skip it in others. Amazon has fought hard against local levies but says it would support a "simple, nationwide system of state and local sales tax collection."

      California's sales tax kerfuffle started back in June, when California Governor Jerry Brown approved an $86 billion budget that imposed deep spending cuts. Under that law, Amazon and other out-of-state online retailers would be required to collect California sales taxes if they had affiliates, offices, workers or other ties to the state.

      Amazon's affiliates program provides a commission to website or blog operators who refer shoppers to the retailer's site. Amazon, which has had associates in California for more than a decade, works with 10,000 affiliates there.

      Brown's measure was expected to add $200 million to California coffers. But the same day the bill was signed into law, Amazon (AMZN, Fortune 500) announced it would terminate its relationship with its California associates. It called the legislation "unconstitutional and counterproductive."

      Instead, Seattle-based Amazon wrote up a referendum challenging the law and spent boatloads of cash collecting signatures for it. Last week, the company sweetened the pot: It offered to build distribution centers in California and hire workers in the state.

      "Amazon throws around a lot of money," said Assemblyman Charles Calderon, a Democrat, who helped broker the agreement. "It's not your typical corporate community. They floated their own proposal. They have an aggressive business plan, and they are wiling to take risks."

      Calderon calls the 12-month push down the road a "grace period," and he said "a national solution" from Congress would make things simpler for everyone.

      Amazon hasn't said whether or not it will reinstate its California affiliates. The company did not return a call seeking comment.

      Meanwhile, other states have passed the so-called "Amazon tax" in recent years. Those include Connecticut, Illinois, New York, North Carolina, Arkansas and Rhode Island. Amazon dropped its associates program in all those states except New York, where it has brought a lawsuit against the state.

      Analysts scramble to raise their iPhone and iPad estimates

                  It happens every three months. As the end of Apple's (AAPL) fiscal quarter approaches, the small army of analysts that cover the stock dusts off its spreadsheets, finds them overly conservative and starts issuing revisions. If history is any guide, the numbers will be revised again -- upward -- when the company reports its Q4 2011 earnings in mid-October.

      Here are the reports we've seen so far this week:

      Sterne Agee's Shaw Wu: Sees "remarkable" strength in iPhone 4 sales despite the expected release of the iPhone 5 in October. Ups iPhone estimate to 18.5 million (from 15.7 million), iPads to 12 million (from 10.4 million) and gross margin to 41% (from 39%). Standing pat on Macs at 4.1 million.
      Jeffries' Peter Misek: iPhone sales are "surprisingly strong," he says, but back-to-school demand for Macs was "weaker than expected." Ups iPhone estimate to 18.9 million (from 18.4 million), lowers Macs to 4.4 million (from 4.9 million).
      BMO's Keith Backman: Raises iPhones to 20.4 million (from 19.5 million), Macs to 4.31 million (from 4.27 million) and iPads to 11.0 million (from 10.5 million), adding that he "sees upside tension to 11.5 million units," whatever that means.
      Pacific Crest's Andy Hargreaves: Sees "significant upside" to the numbers in his spreadsheet but doesn't seem quite ready to change them. Could easily imagine iPhone sales going to 24.05 million (from the 18.7 million in his model) and iPad sales going to 16.5 million (from 11.1 million).

      For the record, the estimates we've seen so far from independent analysts (whose track record is considerably better than Wall Street's) are higher on iPhones and Macs and lower on iPads. On average, they're calling for 24.9 million iPhones, 4.85 million Macs and 10.5 million iPads.