Five Tips for Making Better Decisions

Making a decision is one of the most powerful acts for inspiring confidence in leaders and managers. Yet many bosses are squeamish about it.

Some decide not to decide, while others simply procrastinate. Either way, it’s typically a cop-out -- and doesn’t exactly inspire confidence in the ranks.

To avoid agonizing over what to do and what to skip, it helps to learn how to make better decisions. You’ll be viewed as a stronger leader and get better results overall. Here are five tips for making quicker, more calculated decisions:

Stop seeking perfection. Many great leaders would prefer a project or report be delivered only 80% complete a few hours early than 100% complete five minutes late. Moral of the story: Don’t wait for everything to be perfect. Instead of seeking the impossible, efficient decision makers tend to leap without all the answers and trust that they’ll be able to build their wings on the way down.
Be independent. Good decision makers are “collaboratively independent.” They tend to surround themselves with the best and brightest and ask pointed questions. For instance, in a discussion with subject-matter experts, they don’t ask: “What should I do?” Rather, their query is: “What’s your thinking on this?” Waiting for committees or an expansive chain of command to make decisions could take longer. Get your information from credible sources and then act, swiftly.
Turn your brain off. Insight comes when you least expect it -- like suddenly remembering the name of an actor you thought you’d plumb forgotten. The same happens when you’re trying to make a decision. By simply turning your mind off for a while, or even switching to a different dilemma, you’ll give your brain the opportunity to scan its data bank for information that is already stored and waiting to be retrieved.
Don’t problem solve, decide. A decision can solve a problem, but not every problem can be solved by making a decision. Instead, decision making often relies more on intuition than analysis. Deciding between vendors, for instance, requires examining historical data, references and prices. But the tipping point often rests with your gut. Which feels like the right choice?
Admit your mistakes. If your feelings steered you wrong, correct the error and fess up. Even making the wrong decision will garner more respect and loyalty when you admit you’ve made a mistake and resolve it than if you are habitually indecisive.

Peer Lending Grows as Business Owner Option

When John Good, owner of the Bubbles Galore Car Wash in Davison, Mich., went to his local bank last year to get a $16,000 loan to expand into the self-serve dog washing business, he was denied.

First Place Bank, a subsidiary of First Place Financial (FPFC) in Warren, Ohio, already held the note for Good's original $500,000 Small Business Administration start-up loan, but the bank required massive documentation and fees -- requirements Good felt were too costly, time consuming and frankly, annoying, given the amount.

"The amount of money we were requesting didn't merit the amount of work and back-end costs," Good says.

That's when Good heard about peer-to-peer lending.

Peer lending is not necessarily new, but a growing number of business borrowers are turning to it -- in part because, with interest rates near record lows, investors are looking for options that offer decent returns.

In the U.S., the two largest peer lending sites are Prosper.com, which has funded $214 million in loans since 2006, according to its website, and 3-year-old Lending Club.

Lending Club's chief executive, Renaud Laplanche, sees an increasing number of investors looking to invest in more small businesses, which frees up more cash for loans. Last year, the overall average initial investment through Lending Club rose to $8,700 by December from $1,800 in March, the company says. Funded loans recently passed the $200 million mark, doubling its volume versus nine months earlier.

Lending Club is targeting an average initial investment of $15,000 by year-end.

"The No. 1 reason why borrowers choose Lending Club as opposed to a bank is lower interest rates. We are a peer-to-peer lending network and therefore create a more efficient way of getting funding to borrowers, whether small businesses or individuals," Laplanche says.

How It Works
Lending Club offers a maximum of $25,000 in either three-year or five-year maturities. Borrowers are allowed up to two loans in active repayment.

Money to fund the loans comes directly from qualified investors (with at least $70,000 in annual income and $70,000 in net worth), not lending institutions.

Applicants must be at least 18 years old, with a valid bank account, a FICO score of at least 660 and a debt-to-income ratio of at most 25 percent (excluding mortgage), among other requirements, according to the company.

Applicants fill out an online application and are told immediately whether they passed the initial screening. If they pass, they are given loan options and their confidential request is posted to the website for two weeks or until the loan is fully funded.

"The process is simpler because we do not underwrite the business itself, we just underwrite the business owner," Laplanche says. "The business owner can get a loan based on his own financial situation and his own credit history. It's fully automated and it's no different from getting a credit card or getting a personal loan from a bank."

Origination fees are between 2 percent and 5 percent of the loan amount.

Annual percentage rates, which include the loan interest rate and fees, average about 11 percent for three-year loans and 14 percent on five-year loans. That compares with an average of 15 percent to 24 percent at banks, Laplanche notes.
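To see what that rate gap means in dollars, here is a minimal sketch using the standard fixed-rate amortization formula. The loan size matches Good's $16,000 request, but the specific rates compared are illustrative assumptions, not Lending Club's actual terms:

```python
# Monthly payment on a fixed-rate installment loan:
#   payment = principal * r / (1 - (1 + r) ** -n)
# where r is the monthly rate and n is the number of payments.
# The APRs below are illustrative, not actual quoted terms.

def monthly_payment(principal, annual_rate, years):
    r = annual_rate / 12          # monthly interest rate
    n = years * 12                # total number of payments
    return principal * r / (1 - (1 + r) ** -n)

loan = 16_000
for label, apr in [("peer lending (11%)", 0.11), ("bank (18%)", 0.18)]:
    pay = monthly_payment(loan, apr, 3)
    print(f"{label}: ${pay:,.2f}/month, ${pay * 36:,.2f} total")
```

On these assumed rates, the three-percentage-point-plus spread works out to roughly $55 a month, or about $2,000 over the life of a three-year loan.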

Websites such as Lending Club also act as a sort of clearinghouse for borrowers, with someone in the middle vouching that "this investor is legit."

But borrowers need to understand that risks are involved, since peer lending investors are not subject to the same rigorous guidelines as banks, warns Marilyn Landis, founder of small-business consultancy Better Business Concepts and a board trustee of the National Small Business Association.

Borrowers need to be extra careful when agreeing to loan terms -- perhaps even bringing in an accountant or counsel to review the terms. Landis also suggests that applicants find out whether they can repay the loan early and, if so, if a cost is involved.

On the other hand, "because they are not regulated, [investors] could be friendly to your industry or friendly to you," Landis says. "You may have a more patient, flexible, more workable mentor that you would not otherwise have."

Good, the owner of the car wash/self-service doggie wash, got his $16,000 loan last year through Lending Club and says he would absolutely use peer lending again.

It took less than seven days from the time he submitted his application to Lending Club to get the funding, Good says. Several potential investors asked questions, with the most wary asking why Good's company didn't have cash on hand to fund the expansion. (Because that cash is also an emergency fund, Good says, and "Nobody wants to drain that emergency fund.")

The diversification has paid off, Good says. He's looking to fully repay the loan at month 20, nearly a year and a half earlier than expected.

Freemium: Is the Price Right for Your Company?

In 2006, venture capitalist Fred Wilson asked readers of his AVC blog to come up with a name for his "favorite business model," which he described like this: "Give your service away for free, possibly ad supported but maybe not; acquire a lot of customers very efficiently through word-of-mouth, referral networks, organic search marketing, etc.; then offer premium priced, value added services or an enhanced version of your service to your customer base."

From an onslaught of responses, Wilson chose the suggestion of e-commerce executive Jarid Lukin: freemium. "I hope the name sticks," Wilson concluded in his announcement, "because I love it."

Obviously, Wilson chose well. The name definitely stuck -- as did the concept, which organizers of the Freemium Summit say is now "the fastest-growing online business model."

How the freemium model works

Do you use LinkedIn, Pandora, Skype, SurveyMonkey or other free online services? If so, even if you never spend a dollar with those businesses, you’re providing value they can -- and do -- monetize.

When you and 80 million others post free personal profiles on LinkedIn, you contribute to a membership that advertisers, recruiters and lead-seeking professionals want to pay to reach. LinkedIn obliges this demand with ad packages, premium subscriptions and hiring solutions that together have resulted in positive cash flow for the past two years.

Or consider the Internet radio service Pandora, whose users primarily listen for free. So how does it make money? For a long time, it didn’t. Then in 2008, Pandora launched a free app that allowed listeners to stream music. Within months, subscriptions soared to 40 million, and 2009 revenues -- from ad sales, premium-level subscriptions, iTunes and Amazon.com payments on user purchases, and deals to integrate Pandora into other sound systems -- climbed past $50 million.

Freemium on a small-business scale

Leo Babauta, author of the book "Focus: A Simplicity Manifesto in the Age of Distraction," is proof that turning free into profit isn’t just for those who serve millions -- nor is it all that complicated. In fact, Babauta achieves success while following his own simplicity advice.

Instead of providing contact information, Babauta’s website points to his Twitter feed, stating, “I generally don’t do email.” He does, however, generously share his path to success, using his website and blog to explain that his self-published book “comes in two flavors: free and premium.” You can download the free version “without having to give an email address or do anything else,” he writes. “It’s uncopyrighted, and you can share it with as many people as you like.”


Where does the money come from? "I have a premium digital version," he explains in a blog post, "which has extra … videos, audio interviews with experts and bonus PDF guides. Enough people have bought it after reading the free version that it’s already a great success."

Freemium lessons to learn from

Addressing the 2010 Freemium Summit, Brent Chudoba, vice president of SurveyMonkey, shared a key piece of advice: He said his business "has never spent a dime on marketing or sales. We had to find a way for usage to drive conversion."

But if you decide to develop a freemium model, keep these best practices in mind:

- Offer a base-level product people want to rave about. Virality drives success.

- Prompt user-to-user recruitment with a product that’s more beneficial when shared with others. For example, Skype calls are free when placed to other Skype members.

- Be sure your offering is scalable so serving each new user gets incrementally less expensive.

- Offer a premium-level product that provides meaningful value otherwise difficult to obtain, realizing that many businesses succeed with only single-digit free-to-paid conversion rates.

- Commit to ongoing user interaction in order to adapt and improve offerings, promote usage, inspire evangelism and prompt premium-level purchases.

- Provide users with a reason and easy way to spread the word, recruit new users and grow your audience into one that advertisers and others are willing to pay to reach.
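The "single-digit conversion rate" point above is easier to grasp with a back-of-the-envelope model. Every figure below is a hypothetical assumption, chosen only to show how a small paid tier can carry a large free one:

```python
# Back-of-the-envelope freemium economics; all figures are hypothetical.
free_users = 1_000_000        # size of the free tier
conversion_rate = 0.04        # 4% upgrade to paid -- "single digit"
premium_price = 60.0          # annual premium subscription, in dollars
cost_per_free_user = 0.50     # annual cost to serve one free user

revenue = free_users * conversion_rate * premium_price
serving_cost = free_users * cost_per_free_user

print(f"Premium revenue:   ${revenue:,.0f}")
print(f"Cost of free tier: ${serving_cost:,.0f}")
print(f"Margin:            ${revenue - serving_cost:,.0f}")
```

The model also shows why scalability matters: if the cost per free user doesn't fall as the audience grows, the free tier eats the margin.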

Do all that, and you too can turn free into revenue, and revenue into profit. No wonder Fred Wilson loved freemium at first sight.

Five Rules to Rebound from Failure

Failure of any kind can be a setback for entrepreneurs, but it doesn't have to spell disaster. I'm the perfect example. I've been rejected by the Marines and I flunked out of law school. The real kicker came in 1998 when a former business partner at a debt-collection company of mine was convicted of fraud. Even though he admitted to committing the fraud without my knowledge, I was indicted on 57 felony counts and my assets were frozen. While I was cleared of all charges four years later, I wound up filing for bankruptcy protection and lost a personal fortune in business equity to the tune of about $3 billion. My only asset left was my house.

Sounds devastating, right? But despite my failures I have been able to pick up the pieces and come through it all with a strong self-image. I attribute that to having a healthy perspective on what failure should and should not mean to me.

When faced with any setback, here are five rules that have helped me over the years and can help you, too.

Don't pretend it never happened.
People are often so anxious to avoid the stigma of failure that they refuse to admit what happened. Denial usually results in a host of other problems, including internal stress and delaying any effective remedy.

The late Dale Carnegie, a well-regarded lecturer and author of the bestseller "How to Win Friends and Influence People," said that when you're quick to admit that you screwed up, your peers will stop holding your feet to the fire and actually begin to comfort you.



Avoid making excuses.
Some people wiggle past the truth by admitting to a problem they sugarcoat in excuses. I was one of them. At one point during my teenage years I was homeless and an alcoholic. At every turn, I told myself that all my shortcomings were not my fault.

My situation only improved when I stopped making excuses and focused on a productive goal. For me, it was getting my General Educational Development (GED) certificate.

Don't mistake a failed goal for a failed person.
Sometimes people take the opposite approach from what I just described. They blame themselves for any and every failure, creating a pattern of negative self-reinforcement. Assuming you'll invariably screw up is dangerous thinking -- and can become a self-fulfilling prophecy. Instead of setting up a mental pattern for failure, ask yourself how you can improve.

Remember, you are not alone.
People fail to reach goals all the time. Take baseball players, for example. They strike out multiple times over the course of a long, 162-game season. And when they fail, they do it in front of millions of TV viewers. The point is that we're not robots. Everyone's bound to stumble every once in a while.

Focus on the lessons learned.
While I ultimately cleared my name after a felony indictment more than a decade ago, a lot of damage had been done. The only way to survive such a world-class level of failure is to focus on the future. Not many people can say they've literally lost billions of dollars and chalk it up to "business lessons." I'm currently rebuilding my company, which now has a portfolio valued at $100 million.

Posted on Sept. 16, 2011 by Jai Krishna Ponnappan

New Rules of Business Marketing

When David Meerman Scott first published The New Rules of Marketing & PR (Wiley) in 2007, Facebook was still mostly for college students. The book helped Scott, then 46 years old, make a name for himself as a marketing strategist. Even so, he had to add new chapters and rewrite a considerable portion of the book for its recently released third edition.

So what are the new, new rules now?

Reaching people online is no longer a nice-to-have -- it's a must-have, Scott says. When people search for products or services to buy, they use the big search engines like Google and Bing, as well as email and social media sites like Twitter and LinkedIn, to ask their friends and family for advice on purchases. "It's essential for entrepreneurs to do a great job at marketing their business using the tools of the web," he says.

But many small businesses are being left behind. Nearly half of small-business owners surveyed recently say they don't use social media at all for marketing or any other business purpose, according to Hiscox, a business insurance company. What's more, only 15 percent of small-business owners consider mobile marketing "very valuable" to their operation, another recent report shows.

Related: Six Tips for Mobile Marketing to Engage Customers

Here, Scott offers his top three tips for navigating the new world of business marketing.

1. Don’t hype your product or service. Instead, identify the people who have problems that your product or service can solve, Scott says. Then segment those prospects into buyer personas. Understand each persona's problems so that you can create online content that markets your business by showing how you solve those problems.

For example, if you own a hotel and only talk on your website about how nice looking it is or how great the location is, that only gets you a small part of the way, he says. The key is to create individualized content for each of your buyer personas, starting from the problems and the buyers. That will help get your business indexed higher in online search results.

2. Share useful content on social media. Once you create great content that addresses buyer problems, share that information on sites such as Facebook and Twitter. But don’t stop there, Scott says. You should also listen and respond to what people are saying on important topics and about your business.

Related: Is Your Business Ready for Video?

If you own a hotel, for instance, and a couple says on Twitter that they are considering a wedding at your hotel, offer a tour of your property, Scott says. You can also join conversations about your industry, even when it's not specifically about your business.

3. If traditional advertising channels are generating sales for you, don't pull back. "Marketing strategies haven't changed," he says. "Nothing is going away."

If you are still spending on offline campaigns, Scott recommends tying them to online efforts. One way is to include a URL or a scannable quick response code in a print ad that links to a page on your website. If you own a restaurant, you can send people to your menu page, for example.

For an example of a small business that's had success marketing online, consider this example that Scott cites in his book.

Related: How to Promote Your Business Blog with Social Media

A marketing automation software company called Eloqua sent a timely email last year after its biggest competitor, a company called Market2Lead, was acquired by Oracle. The owner of Eloqua heard about the deal as it was happening and wrote a blog post about why it was good for the industry. Eloqua then sent the post in an email to every company in its database that was a Market2Lead customer. The email went out so quickly that it was the first many Market2Lead customers had heard about the acquisition.

As Scott notes in the book, Eloqua told him that emailing the blog post convinced a number of Market2Lead customers to become Eloqua clients, generating about $1 million in new business.

Five Tips for Mobile Marketing Beyond Text-Message Ads

Not long ago, mobile marketing meant simply text-message ads. No more. Now a range of free and low-cost technology can help you market products and services to mobile consumers in a variety of new ways.

Of course, mobile ads remain effective, but they're just a small part of a comprehensive mobile-marketing strategy. New to mobile marketing? Consider these five ways a small business on a budget can take its marketing on the go.

1. Develop a mobile-friendly website.
An essential first step is to make sure your small business website looks good and performs correctly on mobile devices. If you do nothing else for mobile marketing, do this. That's what I advise my clients as a marketing consultant. Anything else you do to market to mobile customers will be icing on the cake.

Take the time to test your site on an iPhone, iPad, Android smartphone, a BlackBerry, and other popular devices. There are many different tools and companies that can help small businesses create mobile-friendly websites. A few low-cost and easy-to-use options worth investigating include MoFuse ($7.95 to $199 per month), Mobify (free to $1,000 or more for complete ecommerce mobile sites), and Wirenode (free to $259 per month).

2. Use location-based apps.
Consider how you can geo-target your audience using the local features inherent in GPS-enabled mobile devices. For example, an increasing number of consumers are using free mobile apps such as Where, Google Places, Yelp, and MerchantCircle to find local businesses while they’re out and about. Visit these websites and claim your business on them. Ask customers to publish reviews on these sites, too. I’ve had clients who have experienced business increases of up to 10% within a couple of months of claiming their businesses and collecting reviews on localized websites with mobile apps.

Related: Five Ways to Win a Sale Using Your Customer's Mobile

An excellent free mobile-marketing app for small businesses with brick-and-mortar locations is Foursquare. You can easily create your own Foursquare company page and offer check-in specials, discounts, and frequent-visitor deals. If your customers enjoy mobile gaming, and mobile gaming is consistent with your brand, then the free Gowalla app can be a great mobile app for brand building.

3. Create your own mobile app.
If you have content or functions that mobile consumers could use at least once a week, then a custom mobile app could be a great option to help you connect with them. It doesn’t have to cost a fortune to develop a mobile app of your own. Prices could range from a few hundred dollars to $10,000 or more, depending on the type of app you create and the developer you work with.

Your mobile app can help drive sales through real-time promotions and bring in foot traffic through local marketing. It can provide quick and inexpensive customer service by offering answers to common questions. Make it a game and it can even add some fun to users’ lives. It’s up to you to create a mobile app that matches your customers’ needs, your brand promise, and your business goals.

4. Use quick-response codes.
QR codes are those black-and-white squares that look like a box of pixels and appear on websites, email messages, ads, posters, packages, window decals, and other marketing materials. When people scan QR codes with their mobile devices, they are typically taken to a website where a specific message or offer is provided. QR code scans and the resulting traffic are tracked, so it’s easy to calculate the return on your marketing effort.
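One common way that tracking works is to point the QR code at a landing-page URL tagged with campaign parameters, so scans show up as a distinct traffic source in your web analytics. A minimal sketch using Python's standard library; the domain and campaign names are placeholders, not real accounts:

```python
from urllib.parse import urlencode

def tagged_url(base_url, source, medium, campaign):
    """Append UTM-style campaign parameters to a landing-page URL."""
    params = urlencode({
        "utm_source": source,
        "utm_medium": medium,
        "utm_campaign": campaign,
    })
    return f"{base_url}?{params}"

# Hypothetical example: a QR code printed on a store poster
url = tagged_url("https://example.com/specials", "poster", "qr", "fall_sale")
print(url)
# -> https://example.com/specials?utm_source=poster&utm_medium=qr&utm_campaign=fall_sale
```

A QR code generated from the tagged URL then lets your analytics distinguish poster scans from ordinary web visits.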

Related: Six Tips for Mobile Marketing to Engage Customers

There are many free and affordable websites that help you create and track QR codes. ScanLife and iCandy both allow you to create QR codes and track scans for free. Custom price quotes are provided to businesses that need more comprehensive tracking and management. For simple QR code creation without tracking capability, Kaywa and Zxing offer free bare-bones QR code creation tools.

5. Publish mobile content.
Companies of all sizes are publishing content for their target audiences to build relationships that lead to sales, loyalty and word-of-mouth marketing. These efforts should be integrated into all areas of your marketing plan, including mobile.

For example, you might write an ebook and publish it online in PDF format for sharing. Further, you could offer it on Amazon’s Kindle ereader device for mobile reading (learn about Kindle royalty rates). If you publish a podcast or online radio show, make it available on BlogTalkRadio (free), Podbean (free to $39.95 per month), or another site that enables you to easily make your podcast content available on iTunes for mobile listening (or publish your podcast to iTunes directly).

If you send marketing email, make sure those messages are mobile-friendly. A growing number of people view their email on their mobile devices. If your email message doesn’t load correctly or displays poorly on a smartphone or tablet, you’ve lost your chance to connect with those customers.

More people purchase smartphones and tablet devices every day, and annual sales of mobile devices have surpassed sales of personal computers. Your customers are using these marketing-friendly gadgets, or will soon be, so don’t wait to create a mobile-marketing plan. While you hesitate, your competitors could already be connecting with the mobile audience.

Understanding Google's New Sitelinks

When it comes to search engine results, the acreage your company claims atop a Google search results page is what matters. Thanks to some tinkering by the search engineers over at the Googleplex, your website's Google search results now have a much better chance of standing out from the crowd.

Just in case you missed it, Google has updated the way sitelinks are displayed in search results. Sitelinks are those hyperlinks to subpages on your website that appear under certain Google listings.

When sitelinks began appearing on search engine results pages back in 2006, they were an abbreviated version of what you see today. The big mystery was how exactly Google determined which of a site's subpages were promoted, let alone why some listings included sitelinks while many others did not.

Fast-forward five years and these expanded links -- formerly displaying only the title of the page being linked -- now contain two dozen or so characters associated with a subpage's title.

Looking at a typical listing, you may find yourself shaking your head at some of the links displayed, like "We Love Logistics." That's because the Google selection process is pretty much a crapshoot: the links are automated based on the link structure of your website. You don't get a say in which links appear below your search listing, and you can't even select the links that you'd like to see at the top of a search result for your company or brand.

The selection, it turns out, is based on Google's proprietary algorithms, and those don't always jibe with what you as a business owner or marketer may want to express to online searchers. Google knows this, says it's working on it, and has gone so far as to offer webmasters a means for removing sitelink URLs that you'd prefer not to see associated with a search for your company or brand.

To demote a sitelink URL, follow these five steps:

1. Log in to your Google Webmaster Tools account.
2. On the Webmaster Tools Home page, click the site you want to edit.
3. Under Site configuration, click Sitelinks.
4. In the For this search result box, complete the URL for which you don't want a specific sitelink URL to appear.
5. In the Demote this sitelink URL box, complete the URL of the sitelink you want to demote.

If your business doesn’t have a Google Webmaster Tools account, you can sign up for free on the Webmaster Tools Home page.

Once you've “demoted” unwanted sitelink URLs on your brand search, take the time to eyeball your website's page titles and descriptions. The sitelink versions are much shorter than those found within your site's source code, which means you're going to have to edit carefully to fit the important words into a description of 25 characters or fewer.

These newly minted sitelinks, featuring a URL and a smidgeon of text, give your potential customers a much better overview of your website's content -- including areas of the site they might not even be aware of. The bonus to you is the capability to monitor your search and site analytics and discover what it is exactly that your visitors are seeking from your website.

Pay-Per-Click Tools for Small Businesses

Managing online paid-search-term campaigns can be like water torture for a small-business owner: A slow drip of tedium, choosing keywords and deciding what to pay for each on services like Google AdWords and Microsoft adCenter.

For the uninitiated, paid-search campaigns involve advertisers paying a fee, usually based on clicks or views, to have their links placed high on search-engine results pages. They typically bid on keywords or keyword phrases, and can find themselves guessing at the words prospective customers might enter into Google, Bing, Yahoo or another search engine. All for the prospect of having a short bit of linked copy appear across the top and down the right side of a web-search results page.
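Whether a keyword bid makes sense comes down to break-even arithmetic: what a click costs versus what a click is worth. A rough sketch, with every number a hypothetical assumption rather than real campaign data:

```python
# Break-even arithmetic for a pay-per-click campaign.
# All figures below are hypothetical assumptions.
clicks = 500
cost_per_click = 1.20         # average price actually paid per click
conversion_rate = 0.03        # 3% of clicks turn into a sale
value_per_sale = 80.0         # average profit per converted sale

spend = clicks * cost_per_click
revenue = clicks * conversion_rate * value_per_sale
roi = (revenue - spend) / spend

print(f"Spend:  ${spend:,.2f}")
print(f"Return: ${revenue:,.2f}")
print(f"ROI:    {roi:.0%}")
```

The same arithmetic run in reverse gives a ceiling bid: with these assumptions, any cost per click above conversion_rate * value_per_sale ($2.40) loses money.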

Bigger companies often have help from pricey pay-per-click automation and management services and perhaps professional search marketers. But small and midsize businesses face a tougher task in finding affordable support for paid-search marketing. Programs exist, but none are easy in my view. Or even that affordable. So to get a feel for the best choices in a tight market, my assistant Alex and I tried out three lower-cost paid-search marketing tools.

Related: Pay-Per-Click Return on Investment Calculator

Our take? Yes, there are some paid-search marketing tools that can help, but pay-per-click marketing is still no trivial matter. Here's what we found.

ClickSweeper
Free 14-day trial; paid service starts at $50 per month for up to 1,000 keywords

What you get: A relatively deep, but affordable, pay-per-click bid-management tool. ClickSweeper, by Santa Clara, Calif.-based Varazo, supports Google, Yahoo and Microsoft accounts and offers a nice set of features to optimize your keywords. Four automated bidding strategies let users prioritize keyword bids based on cost, ad ranking, number of conversions or return on investment. There are analytics tools to track costs, ways to manage actual ad copy and options to create performance alerts. You can also generate reports and graphs to track which keywords work and which don’t.

Why you might like it: It’s flexible. Overall we found that ClickSweeper strikes a good balance between automatic bidding and user control. You can let the tool do the bidding for you, or if you need to micromanage a few keywords, you can enter bids manually. There is a nice sense of direct control over your spend.

Why you might not like it: It’s complex. That’s partly due to the nature of the pay-per-click beast, but there are numerous menus, tabs and options to set for every keyword. So gearing up the service can feel as onerous as trying to manage your AdWords campaign with no help. ClickSweeper does offer a set of tutorial videos. They’re dry and watching them takes time, but they can get the job done.

What to do: If you are outgrowing Google’s AdWords tools, ClickSweeper is a logical next step. Just be sure you give yourself plenty of time -- and patience -- to figure it out.

WordStream for PPC
Free 7-day trial; paid service starts at $299 per month to manage up to a $10,000 monthly ad spend.

What you get: A great suite of Google AdWords campaign-building tools. Boston-based WordStream offers a pay-per-click management platform that lets users easily build ad campaigns from scratch or fine-tune campaigns with some cool keyword analysis features.

Why you might like it: Ease of use. WordStream simply shines at managing keywords. A long list of powerful keyword research tools helps you decide how to build your campaigns and write ad copy. And WordStream does a nice job of suggesting new or related keywords, and recommending words to avoid. We especially liked the way the tool helps to effectively group keywords, one of the trickiest parts of search-engine marketing.

Why you might not like it: Simplistic keyword bid management. WordStream does a good job of tracking how keywords perform, but users might miss the opportunity to assign complex rules and goals for bidding that are available in some other services. So you can waste money, unless you have a firm grasp of your bid strategy.

What to do: For ongoing paid-search-marketing efforts, WordStream makes a lot of sense. It offers a nice mix of cost and features for a more sophisticated pay-per-click marketing effort.


Clickable

Free 15-day trial; paid service starts at $499 per month.

What you get: What amounts to an entry-level version of a top-end paid-search tool. If your business invests significant money in paid-search marketing, then Clickable is for you. You get a top-line PPC management tool that works with Google, Yahoo, Bing and even Facebook. For an additional $300 per month, Clickable will even assign an employee to help you design ad strategies -- an affordable option, considering the cost of paid-search expertise.

Why you might like it: Clickable offers a powerful mix of features well suited to most small-business needs. It generates daily bid recommendations based on revenue goals. Custom reports track and compare whatever data you'd like and turn it into a neatly branded presentation. The bulk keyword editing tool quickly manages your ad copy and campaigns simultaneously across different search engines, which can be handy for an advertising blitz. And social media gets its due: Facebook marketing tools also help your business break into what some are calling "F-Commerce."

Why you might not like it: While Clickable may look affordable compared with sophisticated paid-search marketing, it isn't low cost. Expect to spend about $10,000 a year. And you still might feel constrained. Bottom line: Clickable may not be the best choice for smaller shops or those just wading into paid search marketing.

What to do: If you are looking for value over a full-service paid-search marketing agency, or if you feel comfortable running your own paid-search marketing internally, Clickable is an intriguing option. Just make sure you know the pay-per-click market, and have the money to invest. With up-front costs this steep, a return on investment might be tough to find.

Measuring Your Business's Success on the Web

Once you invest in a website, you want to make sure it's creating real business value. You don't need expensive software or a degree in statistics to measure your site's success, but with all the data available, it's hard to know what to keep track of and what it all means.

Fortunately, all you need are simple (and free) tools that'll help you focus on a few key data points.

What are your goals?
Before you can know what to measure, you need to clarify your goals. Why do you have a website? If you sell products online, this part is easy. But often, it's a bit harder to pin down. You typically want to make a short list of revenue and engagement goals.

Examples of revenue goals:

People can buy something via the website.
Visitors can request more information or an appointment.
People can get information to visit your physical location.

Examples of engagement goals:

Visitors find the site through a search engine.
People view more than one page of the site.
Visitors sign up for your RSS feed.


Tools you need

Web analytics program. This is critical for measuring the success of your site. There are many choices out there, ranging from free to costing thousands of dollars a month. The easiest to use and install is Google Analytics, which is free.
Google and Bing webmaster tools. With both Google and Bing, you simply verify that you own the domain in order to access data the search engines have about your site.

What to focus on

1. Is the number of qualified visitors increasing?
The best kind of traffic is qualified traffic. "Qualified" visitors are those who are potential customers. How can you find out whether visitors are potential customers? Unless they come to your site directly (because they already know about it), you can look at the traffic source.

Traffic from unpaid searches
This is traffic that results from someone doing a search and clicking on your site. Once the site has sizable search traffic, you can start monitoring and categorizing the search keywords people are using.

For example, are people searching for your company name? One keyword category should be "branded" to include all variations of your company name and website. Your other categories should relate to your industry. For instance, if you sell pool accessories, you might have keyword categories for chemicals, slides, inflatable toys and FAQs. You can set these categories up with your webmaster tools and view them in reports to find out what customers are searching for online.
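To make the grouping concrete, here is a minimal Python sketch of sorting search keywords into the kinds of categories described above. The category names and matching terms are purely illustrative (a hypothetical pool-accessories shop), not taken from any real webmaster-tools account:

```python
# Hypothetical keyword categories for a pool-accessories site.
# "branded" holds variations of the (made-up) company name; the rest
# mirror the industry categories mentioned above.
CATEGORIES = {
    "branded": ["poolpro", "pool pro", "poolprosupply"],
    "chemicals": ["chlorine", "algaecide", "ph balancer"],
    "slides": ["pool slide", "water slide"],
    "inflatable toys": ["inflatable", "pool float", "swim ring"],
}

def categorize(keyword: str) -> str:
    """Return the first category whose terms appear in the search keyword."""
    kw = keyword.lower()
    for category, terms in CATEGORIES.items():
        if any(term in kw for term in terms):
            return category
    return "uncategorized"

for search in ["PoolPro chlorine tablets", "best inflatable pool float",
               "pool slide installation"]:
    print(search, "->", categorize(search))
```

Reports grouped this way show at a glance whether branded or generic searches are driving more of your traffic.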

Traffic from referring sites
An increase of traffic from referring sites usually means that people are linking to your content because it's valuable. The pages being linked to most often are probably also the most useful. For instance, if your blog post describing how to add a waterfall to your pool is getting traffic from 20 referral sites, while your post on how to choose a pool cover is only linked to by one site, then maybe more people are interested in waterfalls than pool covers (assuming you promoted both articles the same way).

2. Do visitors find the site useful?
Not everyone is looking to buy something. Some visitors might want to learn more about your company or products. If you track only sales, you may lose valuable insight about how well you are engaging potential customers.

Increased sales
Still, the most obvious and useful metric to track is revenue. You can track goals in Google Analytics (you may want to have an analytics expert set this up) to monitor events such as checkouts or clicks on buy buttons when you sell products online.

When you tie this to your keyword categories and referring site information, the data become even more useful. Do 10 percent of visitors who come from links on online parenting forums purchase from your site, but only 1 percent of visitors who come from gardening forums buy something? If so, parents might be more valuable to your business.
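The comparison above is just a per-source conversion rate. A short Python sketch, using an invented visit log (the source names and numbers are made up for illustration):

```python
# Made-up visit log: (referring source, whether the visit led to a purchase).
visit_log = [
    ("parenting-forum", True), ("parenting-forum", False),
    ("parenting-forum", True), ("parenting-forum", False),
    ("gardening-forum", False), ("gardening-forum", False),
    ("gardening-forum", True), ("gardening-forum", False),
]

def conversion_rate(log, source):
    """Fraction of visits from `source` that converted to a purchase."""
    outcomes = [bought for src, bought in log if src == source]
    return sum(outcomes) / len(outcomes) if outcomes else 0.0

for src in ("parenting-forum", "gardening-forum"):
    print(f"{src}: {conversion_rate(visit_log, src):.0%}")
```

With real data, the visit log would come from tying your analytics goals to referral reports rather than from a hand-built list.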

Increased engagement
Using the keyword categories you set up, you can see what kinds of visitors are finding your site valuable. The following data can be useful:

Time on site and pages viewed. These two metrics are often related. You might set a goal of at least 30 seconds on site if you want visitors to read through an entire article. Or you might set a goal of at least two page views per visit, which indicates that visitors are finding the site interesting enough to dig deeper.

You can also track the bounce rate -- that is, how many visitors are leaving the site as soon as they land on it because it didn't have the information they were looking for.

RSS subscriptions. This metric is particularly useful for a blog. If someone comes to your site from a search or referring site, reads a blog post, then subscribes to your blog, it's a good indication that your article was deemed useful. You can monitor subscriptions via goal tracking in Google Analytics.
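The engagement metrics above -- pages per visit, time on site, and bounce rate -- are simple ratios. A minimal Python sketch with invented per-visit numbers, treating any single-page visit as a bounce (a simplification of how analytics packages define it):

```python
# Invented per-visit data: pages viewed and seconds spent on the site.
visits = [
    {"pages": 1, "seconds": 5},   # single-page visit: counted as a bounce here
    {"pages": 3, "seconds": 95},
    {"pages": 2, "seconds": 40},
    {"pages": 1, "seconds": 8},
]

bounce_rate = sum(v["pages"] == 1 for v in visits) / len(visits)
avg_pages = sum(v["pages"] for v in visits) / len(visits)
met_time_goal = sum(v["seconds"] >= 30 for v in visits) / len(visits)  # 30-second goal

print(f"Bounce rate: {bounce_rate:.0%}")
print(f"Average pages per visit: {avg_pages:.2f}")
print(f"Visits meeting the 30-second goal: {met_time_goal:.0%}")
```

In practice your analytics package computes these for you; the point is that each metric is a fraction of visits meeting (or failing) a goal you chose.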

3. What are your search performance metrics?
Using your keyword categories, you can also monitor how well your site is ranking in search engines for specific searches and if people are clicking through to your site.

Both Google's and Bing's tools track impressions (the number of times searchers saw your site in the results), clicks, click-through rates, and average positions for keywords that send traffic to your site. By looking at these metrics, you can see if your search rankings are improving, as well as if people are finding your search results compelling.
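Click-through rate is just clicks divided by impressions. A short Python sketch with invented numbers shows how comparing CTR across keywords can flag an unappealing search listing:

```python
# Invented impression/click counts per keyword (real numbers would come
# from Google's or Bing's webmaster tools).
keyword_stats = {
    "pool chemicals":  {"impressions": 2000, "clicks": 60},
    "pool slide":      {"impressions": 500,  "clicks": 40},
    "inflatable toys": {"impressions": 1200, "clicks": 12},
}

def ctr(stats):
    """Click-through rate: clicks divided by impressions."""
    return stats["clicks"] / stats["impressions"]

for kw, stats in keyword_stats.items():
    print(f"{kw}: CTR {ctr(stats):.1%}")
# Here "inflatable toys" gets plenty of impressions but a low CTR -- a hint
# that its title and snippet (or its average position) need attention.
```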

Getting started
The key to monitoring activity on your website is to not get overwhelmed by data. Make sure you have an analytics package installed on your site and are registered for the search engine tools. Be clear on the goals of the site -- and then start with basic questions: What's my overall traffic? What sites are linking to me most often?

Soon, you'll have great insight into how your website investments are paying off.

How Will Google Affect SEO?

    Updates to Google's Algorithms and Manual Actions/Penalties 


Google fine-tunes and changes its algorithms on a daily basis, while major algorithm updates are released regularly.

Furthermore, Google actively checks its results to identify sites that are breaching its standards (https://bit.ly/webmaster_best_practices), and such sites may face ranking penalties as a consequence.

All of these approaches are intended to improve the overall quality of Google's search results. These algorithm tweaks and penalties can sometimes have a significant impact on your organic traffic.

A significant drop in search engine traffic to your website can be detrimental to your company's bottom line.

That sort of revenue loss may necessitate layoffs or even the closure of the company. As a result, you need a basic grasp of how the Google ecosystem works, how Google suggests you run your website, and the many circumstances that can result in lost visibility and traffic.

Otherwise, Google updates or penalties may affect you, and it may seem that this occurred due to circumstances beyond your control.

However, if you have a good grasp of what Google is attempting to do with its algorithm adjustments and penalties, you can dramatically decrease your exposure to them, and perhaps set yourself up to avoid penalties and benefit from the updates.

If you've already experienced a traffic loss as a result of an update or penalty, it's critical to know what caused it and what you need to do to recover.

    Updates to the Google Algorithm. 



    Google's numerous search algorithms are updated in a variety of ways, including changes to search functionality, changes to search result composition and style, changes to various parts of relevance and ranking algorithms, as well as daily testing and bug corrections. In this part, we'll look at the many sorts of adjustments Google makes and how they affect the search results that users interact with.  
     

    BERT.

     
Google announced the existence of BERT (https://www.blog.google/products/search/search-language-understanding-bert/) on October 25, 2019. Bidirectional Encoder Representations from Transformers (BERT) is a neural network-based natural language processing (NLP) technique.

    This is what Google had to say about BERT's impact: "BERT will help Search comprehend one out of every ten searches in English in the United States, and we'll expand this to new languages and locations over time." 

Prior to BERT, when Google's algorithms were attempting to figure out what a word or phrase meant, they could only look at the neighboring text on one side of it -- either before or after.

This was essentially a unidirectional approach. BERT allows Google to grasp the meaning of a word or phrase by analyzing the text both before and after it. BERT was initially applied only to US English queries, with Google claiming that it affected 10% of them.

On December 9, 2019, Google stated that BERT had been expanded to more than 70 languages (https://searchengineland.com/bert-is-rolling-out-to-google-search-in-over-70-languages-326146). In addition to BERT, Google also released a paper on SMITH, a novel algorithm that has the potential to be the next step after BERT.

What SMITH may bring to the table is the ability to comprehend longer passages within large documents, in the same manner that BERT comprehends words and phrases. As of November 2021, it was unclear whether the SMITH algorithm had been rolled into Google Search, but it shows that Google is still looking at ways to enhance natural language processing.
     

    Subtopics and Passages.

     
Google stated on October 15, 2020 that it would be releasing two new search algorithms (https://www.blog.google/products/search/search-on/).

The first of these was an algorithm that allows Google to organize its search results into topics and subtopics. The idea came from Google's discovery that, in many situations, broad user queries are quickly followed by further queries aimed at narrowing down what the user is seeking.

    For example, if a user searches for "Home Exercise Equipment," Google may provide some initial results as well as subsections for "Affordable Exercise Equipment" and "Small Space Exercise Equipment," since these are common follow-up questions. 

    In January 2021, Google's Danny Sullivan revealed that the subtopics algorithm was launched in mid-November 2020. 

The other algorithm disclosed was one that allows Google to recognize and "index" certain passages within a web page independently of the rest of the page's content.

The goal of this change was to enable Google to respond to highly specific user queries. The importance of this approach stems from the fact that many user needs are quite detailed. While the answers to these questions may be found in many locations on the internet, they are often buried among other information whose overall relevance may not be well aligned with the individual query.

With this change, Google can identify specific passages within a larger document that are relevant to a particular query. The first version of the Passages algorithm was rolled out on February 11, 2021, according to Google's Danny Sullivan (as @SearchLiaison on Twitter).
     

    Core Web Vitals and Page Experience.

     
Google stated on May 28, 2020 that it would start using a new signal called Page Experience (https://developers.google.com/search/blog/2020/05/evaluating-page-experience).

    This was followed by a series of explanations about the new signal's rollout schedule. The Page Experience deployment started in mid-June 2021 and was projected to be completed by the end of August 2021 (https://developers.google.com/search/blog/2021/04/more-details-page-experience). 

    The Page Experience signal is made up of a number of pre-existing signals that all have something to do with whether or not your site provides a positive user experience. 

Google has combined all of these signals into one larger score in the overall algorithm, which makes Page Experience much easier to handle as a ranking component.

Page Experience's total weight can be managed as a single signal, and the relative weighting of its distinct components can be tuned independently of the main algorithm. Furthermore, if Google decides to introduce a new page-experience-related signal, it can readily be added to the Page Experience signal without affecting the bigger algorithm.

While Page Experience is important, keep in mind that the most significant signals are always content relevance and quality. Just because your website about tadpoles is fast, for example, it won't start ranking for user queries about cooking pots.

Similarly, if your content isn't relevant, a high Page Experience score won't help it rank. However, some searches are very competitive, with several viable sites offering high relevance and high quality to satisfy the user's needs. In those situations, the Page Experience signal might help you rank somewhat higher than your competitors.
     

    Update on Link Spam.

     
     Google announced yet another important change in 2021, this time focusing on link spam. The Link Spam Update, as it was dubbed, started rolling out on July 26, 2021 and ended on August 24, 2021. In a blog post titled "A reminder on qualifying links and our link spam update" (https://developers.google.com/search/blog/2021/07/link-tagging-and-link-spam-update), Google detailed the nature of this modification. 

While the article does not expressly state what the link spam update addressed, it does begin with a discussion of affiliate links and guest-posting issues.

It includes a reminder about the importance of using link attributes such as rel="nofollow", rel="sponsored", and rel="ugc" where applicable. This isn't to say that other kinds of link spam weren't considered, but it does indicate that these were the primary emphasis.

    What it does show is that Google is still grappling with link spam, and although they have made significant progress over the years, there is still space for improvement. 

    Updates to the Broad Core Algorithm. 


Google started announcing what it terms Broad Core Algorithm Updates (BCAUs) in March 2018. Since then, Google has been unveiling significant changes on a regular basis, and as Sullivan pointed out, they occur multiple times every year.

It's also worth noting that these verified changes are the ones Google thinks important enough to confirm; Google releases many more updates on which it chooses not to comment.

    Aside from these verified improvements, the industry has highlighted a slew of additional dates when Google's algorithm tweaks seem to have had a greater effect. 

    These unverified changes may have a big influence, with a lot of websites gaining or losing traffic as a result. 

    Furthermore, Google updates its algorithms on a regular basis. Google's Danny Sullivan said in July 2019 that the company has made over 3,200 algorithm adjustments in the previous year (https://www.blog.google/products/search/how-we-keep-google-search-relevant-and-useful/). 

Changes in Functionality.

Google makes changes to its search engine's functionality on a regular basis, and some of these changes are announced.

Bug Fixes on Google.

Because Google Search is such a huge and complicated ecosystem, it's unavoidable that bugs will appear from time to time.
     
     

    Webmaster Guidelines from Google


    If you're the owner/publisher of a website and want to increase your Google traffic, it's important to learn Google's Webmaster Guidelines (https://bit.ly/webmaster_best_practices). These are the guidelines that Google expects webmasters to follow while creating and maintaining their websites. While Google cannot compel you to follow these principles, it may choose to penalize websites that do not. 


    Fundamental guidelines that Google expects webmasters to follow:  

     

    Construct pages with users in mind, not search engines. 

     
     This is a crucial component of any company's online presence. Knowing what your target consumers want, how they search, and how to provide that information in a comprehensible and entertaining manner is smart business, and it's also beneficial for Google rankings. 
     

     Don't mislead your customers. 

     
Unfortunately, this one made the list because many websites use bait-and-switch methods to lure consumers into material and experiences that aren't what they anticipated.

For example, sites with Cumulative Layout Shift issues (covered in the Page Experience portion of this article) may lead visitors to click on the wrong part of a page, resulting in a bad user experience.

Avoid using tricks intended to boost your search engine rankings.

    A decent rule of thumb is to consider if you'd feel comfortable defending your actions to a competitor's website or a Google employee. 

    "Does this aid my users?" is another good question to ask. 

"Would I do this if there were no search engines?" Take particular note of that last question.

It may seem naive at first, but when you consider that Google actively tweaks its algorithms to find the sites that serve people best, it all makes sense.

    All of Google's algorithms are being fine-tuned to discover the greatest user experiences, thus concentrating your efforts on providing exceptional value for people is tightly linked to increasing your chances of ranking in Google. 

    Consider what makes your website special, useful, or interesting. 

    Make your website stand out from the competition in your industry. 

    A user-centered approach is vital, but not sufficient. You should also attempt to establish a site that stands out, just as you would if you were running a company. 

    Otherwise, there will be nothing on your website that would entice consumers to visit it, and Google will have no motivation to rank it highly in the search results. 

    Google outlines a number of particular rules in addition to these fundamental concepts. These are grouped into two categories: practices to avoid and behaviors to follow. 
     


    Practices to Stay Away From.

     
     
     

    Automatically produced content.

     
In this case, Google is focusing on pages that are created artificially for the goal of getting search traffic but contribute no actual value. Of course, if you operate a retail site, you may be using your ecommerce platform to produce pages automatically from your product database, but Google isn't concerned with that.

This rule is aimed at machine-generated (a.k.a. "mad-libbed") content that makes little sense to readers.

Participating in link schemes.

Because links to your site remain a key component of the Google algorithm, many parties offer schemes to produce links to your site cheaply and artificially. Concentrate your efforts instead on attracting links that represent authentic citations of your site.
     

    Creating pages with little or no original material.

     
     This may take various forms, including automatically created pages, pages with little or no user value or purpose that exist only to persuade someone to click on an affiliate link, content stolen from other sites, and gateway pages.  

    Cloaking 

     
Cloaking is "the practice of presenting different content or URLs to human users and search engines," according to Google. The reason this is a problem is that some websites were built to present Google with a rich, informative experience that Google might choose to rank, but when users arrived at the site, they got something completely different.
     

    Sneaky redirects. 

     
This is when you use redirects to send people to a different page than the one Googlebot sees. As with cloaking, users may land on content that does not match what they anticipated when they clicked on the link in a Google search result.
     

    Text or links that are hidden.

     
     These are spammy practices that date back to the early days of search engines, in which material is presented on a website in such a manner that it is not visible, such as putting white text on a white background or positioning it far off the page using CSS. A frequent spam strategy with links was to include a link to a page but only use one character as a link, such as a hyphen. 
     

    Doorway pages.

     
     These are pages that were designed exclusively for the aim of attracting search engine traffic, rather than to provide a fantastic user experience. In actuality, they are often produced in large quantities and are poorly integrated with the rest of the website. They might also be programmed to target a large number of search keywords that are quite close but not identical. 
     

    Content that has been scraped. 

     
     Taking information from other websites and republishing it on your own is not just a copyright infringement, but it's also frowned upon by Google. Minor changes, such as the use of synonyms, are also insufficient. If you're going to quote material from another website, be sure to give credit to the original source and add your own unique value.  
     

    Taking part in affiliate programs without offering enough value.

     
In the past, Google had a lot of issues with sites that made all of their money from affiliate programs and found methods to rank low-quality material in the search results. There's nothing wrong with earning a portion of your income from affiliate programs, or even all of it. But if you don't have much valuable material to offer consumers, Google will not rank your site.
     

    Loading pages with keywords that aren't relevant.  

     
Also known as "keyword stuffing," cramming your website with irrelevant or needlessly repeated phrases detracts from the user experience and is considered spammy by Google.

Creating malicious websites, such as phishing pages or sites that install viruses, trojans, or other malware.

The reasons Google refuses to include such sites in search results are evident, although the malware is not necessarily the result of the publisher's own actions. Sites can be hacked, so it's important to be vigilant about maintaining your site's security and to check frequently whether it has been compromised.
     

Abusing structured data markup.

     
     Structured data allows you to improve the look of your listing in Google's search results, but it also has the potential to be abused.  
     

    Sending automated inquiries to Google.

     
This is the practice of sending massive numbers of searches to Google using automated tools. This sort of activity is often used for rank monitoring, but Google doesn't like it, since it wastes Google's resources while providing little value in return. Many tools, such as BrightEdge, Searchmetrics, SEMrush, seoClarity, Conductor, and others, provide large-scale rank tracking. Using one or more of these tools can be a useful part of your SEO strategy, as long as you don't overdo it.

Practices to Follow.

This list is rather brief and concentrates on two practices that reflect good site hygiene.
     

    Watching for hacking on your site and deleting hacked material as soon as it appears.

     
Unfortunately, this is more prevalent than you might think. Hackers use programs that scour the internet for security flaws, then use those flaws to inject their code into your web pages, frequently in the form of invisible links to their own sites.

One of the most effective measures you can take to reduce your risk is to keep your software platform up to date at all times. If you use WordPress, for example, always install the most recent updates as soon as they become available, including updates for any plugins.
     

    Preventing and eliminating spam from your site created by users.

     
Any site that enables users to submit material in any way runs the risk of receiving spammy content -- for example, if you enable comments on your material or host forums on your site.

Some unscrupulous actors may manually insert spammy content, while others use programs that crawl the internet looking for places to leave comments or posts. Some of the best practices here include requiring moderation of all comments or posts, or reviewing all comments or posts as soon as they are published.

There are gradations to this, such as requiring moderation of a user's initial comment or post, but allowing them to submit further material without approval after that. You should, however, make time to review such contributions after they've been submitted. User-generated spam may also be found on free hosting services, which allow anybody to set up websites without spending any money.

If you run a free hosting platform, you'll need to take similar steps to ensure that you don't end up with spammy material on your site. Take a look at Google's Webmaster Guidelines (https://developers.google.com/search/docs/advanced/guidelines/webmaster-guidelines). Anyone who starts to invest proactively in improving their organic search presence should be aware of these guidelines and take action to ensure that their company does not breach them.
     

    High-Quality Content 


    Because we, as website content creators, desire Google traffic, it is our responsibility to create high-quality material. This necessitates an understanding of our target audience, how and what they search for, and then providing high-quality content wrapped in an excellent user experience so they can quickly find what they're looking for. 

However, as you might expect, developing high-quality content isn't always straightforward, and many people try to cut corners, which can lead to low-quality or even spam-like material appearing in search results. To address this, Google does a variety of things to keep low-quality material out of the SERPs.

Google made a huge stride forward a decade ago when, on February 24, 2011, it unveiled the Panda algorithm. Google said the following in its release statement (http://bit.ly/more_high-quality):

    Many of the adjustments we make are so little that they go unnoticed by most others. 

    But, in the last day or so, we've implemented a significant algorithmic change to our ranking—a change that affects 11.8 percent of our queries—and we wanted to let people know about it. 

    This update aims to lower the ranks of low-quality sites, such as those that provide little value to visitors, replicate material from other websites, or are just not very helpful. Simultaneously, it will boost the ranks of high-quality sites with unique material and information, such as research, in-depth reporting, and intelligent analysis. 

The most significant change Panda made to the Google landscape was that it improved Google's ability to evaluate content quality. One part of this was downgrading sites that published low-quality material in large volumes in order to garner significant amounts of search traffic. Over time, however, Panda evolved to address content-quality issues on a far broader scale.

Panda was formerly a distinct algorithm from the main Google algorithm, but Google announced in January 2016 that Panda had been completely merged into the main algorithm. Google's algorithms continue to prioritize content quality.
     



    Content that Google despises. 



    The following are some of the main sorts of material that Google believes to be poor:
     

Thin content.

     
As one would imagine, this is described as pages with very little content. Two examples are user profile pages on discussion sites with minimal information filled in, and an ecommerce site with millions of products but little information about each one.

     

    Unoriginal material. 

     
These might be scraped pages, or pages that have been only slightly modified, and Google can readily detect them. Google's algorithms may penalize websites with even a modest number of these sorts of pages.

Nondifferentiated material.

Even if you write 100% unique articles, this may not be sufficient. If every page on your site covers topics that have been covered hundreds or thousands of times before, your site isn't truly adding anything new to the Web.



    Poor Quality Content.



Poor-quality content is material that is inaccurate or badly constructed.

This may be difficult to spot in many circumstances, but material with bad grammar or many spelling errors is one clue. Google may also use fact-checking as a means of identifying low-quality material.

     

    Curated content

     
Google's algorithms penalize sites with a significant number of pages consisting of lists of curated links. Content curation isn't intrinsically negative, but if you're going to do it, it's critical to include a substantial amount of meaningful commentary and analysis. Pages that are little more than lists of links will score poorly, as will pages with many links and only a small quantity of original material.
     

Thin slicing.

Thin slicing was formerly a prominent strategy employed by content farms. Let's say you wanted to write about colleges that offer nursing degrees. Content farm sites would publish many articles on essentially the same subject, with titles like "nursing schools," "nursing school," "nursing colleges," "nursing universities," and "nursing education." There is no need for all of those articles, since their content would be essentially identical.
     

    Content produced by a database.

     
    Using a database to produce web pages isn't intrinsically wrong, but many businesses were doing it on a massive scale. 

    This might result in a lot of thin-content or low-quality pages, which Google dislikes. It's worth noting that ecommerce systems effectively generate content from a database, which is fine as long as you work hard to create compelling product descriptions and other information for those pages. 

    Diverse Content Is Important.

     
     For Google, diversity is critical to overall search quality. The search query "jaguar" is an easy way to demonstrate this. The term may refer to anything from an animal to a vehicle to a guitar to an operating system to an NFL franchise. 


    The Role of Authority in Content Ranking.

     
     While Google returns a lot of results for any subject, only a few sites rank well for a given search query. What factors determine their rank? 

    When producing content on a subject that is already widely covered on the Internet, very high-authority sites are likely to fare well. There are a few plausible explanations for this; a lot depends on your reputation and authority. 

    If the New York Times Lifestyle section published yet another story about how to make French toast, readers would likely still react warmly to it. 

    Because of the site's repute, user engagement signals with the search result for such material would most likely be fairly high. High-authority sites are presumably that way because they don't participate in a lot of the conduct that Google warns webmasters about. 

    You're unlikely to come across a lot of thin material, "me too" stuff, thin slicing, or any of the other flaws that Google's algorithms target. A high-authority site may simply be subjected to a broader set of standards than other sites. It's unclear what characteristics give higher-authority sites greater wiggle room. 

    Is Google evaluating the user's engagement with the material, the content's quality, the publisher's authority, or a combination of these factors? 

    What Google does most likely has parts of all three. 

    Weak Content's Impact on Rankings 


    Even poor content in a single portion of a bigger site might lead Google to reduce the site's overall rankings. 

    This is true even if the material in question accounts for less than 20% of the site's pages. As shown in Figure 2-13, this may not be an issue if the remainder of your site's content is excellent, but it's better not to risk it if you have known weak pages that are worth the time to fix.  
     

    Improving Content That Isn't Good.

     
     When dealing with thin content, it's essential to delve deep and ask tough questions about how to create a site with a lot of great material and plenty of user interaction and engagement. You want to generate highly distinctive content that people seek, like, share, and connect to on your site. Creating content that people will interact with is a science. We all know how crucial it is to choose interesting headlines for our material, and we also know how vital it is to include captivating visuals. Make it a point to learn how to generate compelling content that people will want to read, and then apply those concepts to each page you make. Furthermore, track your interaction, experiment with alternative ways, and enhance your ability to create amazing content over time.  
     

    Actions to take if your pages aren't performing well.

     
    Addressing your site's weak pages should be a large part of your emphasis when you review it. They might take the shape of a complete section of low-quality material or a few pages strewn throughout your site's higher-quality content. 

    Once you've found those pages, you have a few options for dealing with the issues you've discovered: 

    Improve the material. This might include rewriting the information on the page to make it more appealing to visitors. 

    Add the noindex meta tag to the page. This instructs Google not to index the page, thereby removing it from the Panda equation. 

    301-redirect users to other pages on your site instead of deleting the pages. Only use this option if there are quality pages related to the ones being removed. 

    Return a 410 HTTP status code when someone attempts to browse a page that has been removed. This informs the search engine that the page has been deleted. 

    Use the URL removal tool (http://bit.ly/remove_content) to remove a page from Google's index. This should be approached with caution: you don't want to remove high-quality pages from Google's index by mistake!  
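    The meta tag, redirect, and status-code options above can be sketched as a simple routing rule. The following Python sketch is illustrative only; the paths and the `response_for` helper are hypothetical, not part of any real system:

```python
# Hypothetical cleanup plan: which HTTP response each problem page should get.
REMOVED = {"/old-thin-page"}     # deleted outright -> 410 Gone
NOINDEXED = {"/sparse-profile"}  # kept on the site, but hidden from search engines

def response_for(path):
    """Return (status, headers) implementing the cleanup choices above."""
    if path in REMOVED:
        # 410 tells search engines the page was deliberately deleted
        return 410, {}
    if path in NOINDEXED:
        # The X-Robots-Tag header is the HTTP equivalent of the noindex meta tag
        return 200, {"X-Robots-Tag": "noindex"}
    return 200, {}
```

    Normal pages are served as usual; only the pages you have explicitly triaged get the special responses.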
     

    High-Quality Links 



    To see how Google employs links, we need only look at Larry Page and Sergey Brin's original paper, "The Anatomy of a Large-Scale Hypertextual Web Search Engine" (http://infolab.stanford.edu/backrub/google.html). 

    This passage appears near the start of the paper: The web's citation (link) graph is a valuable resource that is frequently ignored by present online search engines. 

    We've made maps with up to 518 million of these links, which is a large portion of the total. These maps make it possible to quickly calculate a web page's "PageRank," an objective measure of its citation importance that closely matches people's subjective perceptions of importance. 

    PageRank is a great approach to rank the results of online keyword searches because of this correlation. 
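    The idea that a page's importance flows from the links it receives can be illustrated with a toy power-iteration sketch. This is a simplified illustration of the published PageRank concept, not Google's actual implementation; the graph and damping value are assumptions:

```python
def pagerank(links, damping=0.85, iterations=50):
    """Toy PageRank: `links` maps each page to the list of pages it links to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iterations):
        # Every page keeps a small "teleport" share, then passes the rest along its links
        new_rank = {p: (1 - damping) / n for p in pages}
        for page, outlinks in links.items():
            if not outlinks:
                # Dangling page: spread its rank evenly across all pages
                for p in pages:
                    new_rank[p] += damping * rank[page] / n
            else:
                share = damping * rank[page] / len(outlinks)
                for target in outlinks:
                    new_rank[target] += share
        rank = new_rank
    return rank

# Page C is "cited" by both A and B, so it ends up with the highest rank.
ranks = pagerank({"A": ["C"], "B": ["C"], "C": ["A"]})
```

    The page with the most incoming votes (C) outranks the others, mirroring the citation analogy described next.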

    The notion of a citation is quite important. 


    An article's author uses the citation list to acknowledge the important sources used while writing the paper. 

    If you looked at all of the articles on a specific subject, you could pretty quickly figure out which ones were the most significant since they had the most citations (votes) from other publications. 

    Consider what links represent to understand why they are valuable as a signal for search engines. When someone links to your website, they are inviting their own visitors to leave their site and visit yours. In general, most website publishers want to attract as many visitors as possible to their site. 

    Then they want those visitors to do something useful on their site, such as purchase something, watch advertising, or visit many pages and thus see many ads. On some sites, where the purpose is to express a strong opinion on a contentious issue, the goal may simply be to convince the visitor to read the publisher's complete perspective. 

    In any of these circumstances, it might be difficult to see the direct economic value of a user clicking on a link to a third-party website that is not an ad. 

    Finally, individuals employ links when they feel they are pointing a user to a high-quality online resource that will provide value to that person. 

    This adds value to the site that implemented the link since the user will have had a positive experience on their site because they linked the user to a helpful resource, and the user may return for future visits. This information is used by Google to help it evaluate which resources on the web are of the highest quality. 

    For example, if someone types in "create a rug," Google will likely return tens of thousands of pages that cover the subject.

    What criteria does Google use to determine which is the best, second best, and so on? 

    Even the most advanced AI systems are unable to reach this conclusion based on content analysis alone. Links allow Google to see what other people on the Internet consider to be valuable resources, and they serve as an input to its algorithms for judging content quality. Not all links, however, are valuable. 

    Ads, of course, are skewed because they are paid for. Low-value links, as well as links from sites without any real information or experience on the subject, are likely to be discounted. Furthermore, many sites continue to try to manipulate the link algorithm in order to earn high rankings without really deserving them. 

    In addition to knowing why certain sites might naturally link to a third-party site, it's important to know what forms of behavior are unnatural and hence likely to be ignored or punished by Google. In the academic environment, for example, you cannot purchase the placement of a citation in someone else's research paper. 

    You don't barter for such placements ("I'll mention you in my paper if you mention me in yours"), and you surely wouldn't sneak references to your work into someone else's research paper without the writer's permission. 

    You wouldn't publish dozens or hundreds of badly written articles solely to get more references to your work in them, either. 

    You wouldn't upload your work to dozens or hundreds of sites set up as repositories for such papers if you knew no one would ever read it or if the repositories included a large number of fraudulent papers with which you didn't want to be linked. 

    In short, you cannot vote for yourself. All of these practices, of course, have taken place on the Internet with links. All of these techniques are in direct opposition to how search engines seek to utilize links, since they rely on links that have been earned on merit. 

    This implies that search engines do not want you to buy links in order to manipulate their results. Of course, you may purchase ads—there's nothing wrong with that—but search engines prefer that ad links carry the nofollow attribute, which tells them not to count the links.
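    To make the nofollow mechanism concrete, here is a minimal sketch of how a crawler might honor the attribute when tallying links. The HTML snippet and class name are illustrative assumptions, built only on Python's standard `html.parser`:

```python
from html.parser import HTMLParser

class LinkCounter(HTMLParser):
    """Collect hrefs of links, skipping any marked rel="nofollow"."""
    def __init__(self):
        super().__init__()
        self.followed = []

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        attrs = dict(attrs)
        rel_tokens = attrs.get("rel", "").split()
        if "nofollow" not in rel_tokens:  # paid/ad links should carry nofollow
            self.followed.append(attrs.get("href"))

html = ('<a href="https://example.com/editorial">review</a>'
        '<a rel="nofollow sponsored" href="https://example.com/ad">ad</a>')
parser = LinkCounter()
parser.feed(html)
# parser.followed now contains only the editorial link
```

    The sponsored link is parsed but never counted, which is exactly the outcome search engines want from properly labeled ads.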

    Furthermore, pure barter relationships are either undervalued or disregarded entirely. 

    From 2000 to 2005, it was common to email people offering to link to them in exchange for a link back, on the theory that this would help with search engine rankings. Of course, these kinds of reciprocal links aren't true citations. 

    Links from user-generated content sites, such as social networking sites, will also be ignored by Google. 

    Anywhere people can link to themselves is a place that search engines will discount, or even penalize if they discover manipulative patterns of behavior. 

    Google has spent a lot of time and money creating systems for identifying low-quality links. For many years, it was a labor-intensive process. 

    However, with the first release of the Penguin algorithm on April 24, 2012, Google took a major step forward. 

    Penguin marked the beginning of its practice of automatically recognizing low-quality links and either discounting them or imposing an algorithmic penalty on the sites that received them. 

    Until the release of Penguin 4.0 on September 23, 2016, Penguin ran separately from the main algorithm and was only updated periodically. As of that date, Penguin was fully integrated into the main algorithm. 

    Google's algorithm was also changed that day to focus entirely on finding low-quality links and discounting them to zero value. 

    Google's confidence in the Penguin approach had risen to the point where penalizing these links was no longer necessary. Google's web spam team, however, continues to manually evaluate the link profiles of sites suspected of being suspicious, and may levy penalties against them.

    We'll go through this in further detail in the Penalties section. As a result, it's a good idea to know what kinds of links Google doesn't like. 
     

    Links That Google Dislikes

     
    The following is a list of several types of links that Google may deem less useful, if not entirely worthless:  
     

    Article directories 

     
     These are websites where you can submit an article for publication with little or no editorial review. All you had to do was post an article, which could include links back to your site. The difficulty is that this is a form of self-voting, and links from these sites are rather easy for Google to detect.
    Similarly, many directories on the Internet exist only to collect money from as many sites as possible. In these sorts of directories, which have little or no editorial review, the owner's primary aim is to collect as many listing fees as possible.  
     

    Links from nations where you don't conduct business 

     
     If your firm does business exclusively in Brazil, there's no reason to have a lot of links from Poland or Russia. There isn't much you can do if someone chooses to give you links that you didn't ask for, but there's no reason to participate in activities that would lead to you receiving links from such countries. The same goes for links from pages in a different language; some SEO experts go out of their way to gain links from all over the place.  
     

    Comment spam. 

     
     Dropping links in comments on forums and blog articles was once a common method. Since Google added the nofollow attribute, this strategy has become much less lucrative, yet active spammers continue to pursue it. In practice, they deploy bots to leave automatic comments on blog articles and forums all over the Internet. They may leave 1 million or more comments in this manner, and even if only 0.1 percent of those links are not nofollowed, the spammers will still get 1,000 links.  
     

    Guest post spam

     
     This refers to badly written guest articles that provide little value to visitors and were created solely to get a link back to the author's own website.  
     

    Guest posts that have nothing to do with your site. 

     
     This is a sort of guest post spam in which the content created has nothing to do with your website. If you sell old automobiles, don't expect Google to think a guest article on lacrosse equipment with a link back to your site is valuable. 
     

    In-context guest post links 

     
     Posts that contain keyword-rich links back to your site in the body of the article are another kind of guest blogging that Google dislikes, especially if the links don't add much value to the post itself.  
     

    Advertorials 


    This is a kind of guest post written in the style of an advertisement. Given the format, it's quite probable that the website that published it was compensated in some way. If you're going to use guest blogging as part of your approach, focus on sites that don't allow these sorts of posts. While the above four examples all involve guest posts, Google generally frowns on any form of guest blogging done purely for link building. This isn't to say you shouldn't guest post; rather, your objective should be to get people to read your material, not to acquire links.  
     

    Widgets.

     
     Creating helpful or interesting tools (widgets) and enabling third-party websites to embed them on their own sites became a popular strategy. Normally, the widgets included a link back to the widget creator's website. In theory, there is nothing wrong with this approach if the content is highly relevant; nonetheless, the tactic was overused by SEOs, causing Google to disregard many of these sorts of links.  
     

    Infographics.

     
     This is another area that, although theoretically permissible, has been heavily exploited by SEOs. At this time, it's unclear what Google does with these links, so you should only produce infographics if they're really relevant, helpful, and (of course) correct.  
     

    Anchor text that is misleading.

     
     This is a more nuanced problem. Consider a case where a link's anchor text says "information about golf courses," yet the page the link points to is about tennis rackets. This is not a pleasant user experience, and it is not something that search engines will like. 
     

    Malware-infected sites 

     
     Obviously, Google seeks to disregard these sorts of links. Malware-infected websites are very harmful to users, so any link from them is worthless and even dangerous. 
     

    Footer links. 

     
     While there is nothing fundamentally wrong with a link in the footer of someone's website, Google may discount the value of such links since they are less likely to be clicked on or seen by people. Read Bill Slawski's article "Google's Reasonable Surfer: How the Value of a Link May Differ Based on Link and Document Features and User Data" (http://bit.ly/reasonable_surfer) for more information on this issue. 
     

    Unrelated links in a list.

     
     This might be an indication of a purchased link. Suppose you come across a link to your "Travel Australia" website in a list of links that also includes an online casino, a mortgage lead generation site, and a lottery ticket site. This does not look good to Google. 
     

    Links from low-quality sites.

     
     The most valuable links are those that originate from extremely high-quality sites that demonstrate a high level of editorial control. Conversely, as quality declines, so does editorial control, and Google may stop counting such links altogether. 
     

    News releases.

     
     It was once fashionable to send out a large number of press releases, each containing keyword-rich text links back to your website. Of course, this is a type of self-voting, and press releases should not be used to promote your site in this manner.  
     

    Social bookmarking sites. 

     
    Delicious, Evernote, Diigo, and similar services are great for storing interesting links for your personal use. However, since they are user-generated content sites, their links are nofollowed and have no effect on your site's rankings. Not all of the types of links listed above will necessarily result in a penalty for your site, but they are all examples of links that Google will most likely ignore. 

     

    Removing Low-Quality Backlinks 


    The first step in the link cleanup process is to get into the correct frame of mind. Consider how Google views your links when you analyze your backlink profile. 

    Here are some general guidelines for determining if a link is valuable:

    Would you want that link if Google and Bing didn't exist? 

    Would you happily show it to a potential customer before she makes a purchase? 

    Was the link given as a legitimate recommendation? 

    When you analyze your backlinks, you may find yourself attempting to justify the existence of a link. 

    This is typically a solid indication that the link is a bad one.

    High-quality links don't need to be justified; their value is self-evident. 

    Recognizing the need to be thorough is another important component of this approach. It's terrifying to lose a lot of traffic, and it's normal to feel impatient. If your site has been hit with a manual link penalty, you'll be eager to submit your reconsideration request, but once you do, there's nothing you can do except wait. 

    If you don't do enough to eliminate harmful links, Google will reject your request for reconsideration, and you'll have to start again. If you submit a number of reconsideration requests without result, Google may give you a notice advising you to take a break. 

    Make a point of eliminating and disavowing links as quickly as possible, and don't attempt to preserve a lot of marginal ones. In the end, this nearly always speeds up the process. 

    Furthermore, the dubious links that you attempt to preserve usually aren't really benefiting you. With all of this in mind, you'll want to complete the process as swiftly as possible. 
     

    Data Sources for Link Cleaning. 

     
     In your site's Search Console account, Google displays a list of external links. 

    Because this list is prone to being incomplete, we suggest that you gather links from a variety of additional sources. Some of the best are Ahrefs (https://ahrefs.com/), Majestic SEO (https://www.majestic.com), SEMrush (https://www.semrush.com), Link Explorer (https://moz.com/link-explorer), and LinkResearchTools (https://www.linkresearchtools.com). 

    Each of these tools, like Search Console, provides only a partial list of links. Because these vendors are small compared to Google and the task of crawling the web as completely as Google does is difficult, it should come as no surprise that they do not cover the whole web.

    Building a combined database from all of these tools, on the other hand, will provide a more comprehensive list of links. 

    In a study of link tool vendors, Perficient found that combining these data sources surfaced twice as many links as the vendor with the biggest index of links (https://blogs.perficient.com/2021/01/26/study-who-has-the-largest-index-of-links/). 

    Of course, there will be a lot of overlap in what they display, so make sure the list is deduplicated. Even combining all of these sources, however, is insufficient. 

    In Search Console, Google only discloses a subset of the links it is aware of. The other link providers rely on their own company's crawls, and crawling the whole Web is a huge operation for which they simply do not have the resources.  
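    The combine-then-deduplicate step described above can be sketched in a few lines. The URLs and the `merge_backlinks` helper here are hypothetical; the normalization rules (case, trailing slash) are simple assumptions, and real exports may need more careful handling:

```python
def merge_backlinks(*link_lists):
    """Union several tools' lists of linking URLs, skipping trivial duplicates."""
    seen = set()
    merged = []
    for links in link_lists:
        for url in links:
            # Normalize lightly so "/post" and "/post/" count as one link
            key = url.strip().lower().rstrip("/")
            if key not in seen:
                seen.add(key)
                merged.append(url.strip())
    return merged

search_console = ["https://example.com/page-a", "https://blog.example.net/post/"]
ahrefs_export = ["https://blog.example.net/post", "https://other.org/review"]
all_links = merge_backlinks(search_console, ahrefs_export)
# all_links holds three unique linking URLs
```

    Feeding every available export through a step like this gives you the deduplicated master list the text recommends building.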
     

     Cleaning Links using Tools.

     
     There are tools that can help speed up the process of removing problematic links by automating their detection. Remove'em (https://www.removeem.com/) and Link Detox (https://smart.linkresearchtools.com/new/link-detox) are two of the most popular. These tools may help you identify some of your bad links. 

    However, you should not depend only on these tools to complete the job. Each program has its own methodology for detecting problematic links, which may save you time when evaluating all of your links. 

    Keep in mind, however, that Google has spent over 15 years perfecting its algorithms for analyzing links, and doing so effectively, including identifying link spam, is a major element of its business. 

    Third-party technologies will fall short of Google's algorithm in terms of sophistication. 

    They can discover some of the problematic links, but not all of the ones that you'll need to fix. 

    You should evaluate all of the links, not only those flagged as dangerous, but also those that are merely questionable or even appear harmless. Use your own judgment rather than relying on the tools to decide what is good or bad for you. 

    Disavowing Links 

    You may disavow links using a service provided by Google.


     (http://bit.ly/disavow_links). The Disavow Links tool informs Google that you no longer want particular links to pass PageRank (or confer any other benefit). This provides a way to reduce the harmful effects of poor links pointing to your website. 
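    For reference, the file you upload to the disavow tool is a plain-text list with one entry per line; a small sketch (the domains here are placeholders):

```
# Ignore all links from this entire domain
domain:spammy-directory.example
# Or disavow a single linking page
http://low-quality-site.example/links.html
```

    Lines starting with # are comments, the domain: prefix disavows every link from a domain, and a bare URL disavows links from that one page.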

    Google Manual Actions (Penalties) 

    There are two ways to lose traffic: Google algorithm updates and manual actions. 

    Algorithm changes are not penalties and do not involve human intervention, while manual penalties do. 

    While the specifics of what causes Google to undertake a manual review of a website aren't always clear, manual reviews tend to be triggered in a variety of ways. Note that although an algorithmic ranking adjustment may occur in certain situations, these are not regarded as "penalties" by Google. 

    The following is a list of the primary probable triggers:  

     

    Spam reports.

     
     Any user (even your competitor) may report spam to Google (http://bit.ly/report_webspam). While Google hasn't said how many of these complaints it receives daily, it probably gets a lot of them. Google reviews each report and undertakes a human assessment if it considers the report trustworthy (it may use an algorithmic verifier to decide this). 
     

    Review initiated by an algorithm. 

     
     While Google has never confirmed this method, it's probable that algorithms are used to prompt a human evaluation of a website. The idea is that Google employs algorithms to discover large numbers of sites with potentially harmful behavior that is not severe enough to punish algorithmically, so these sites are queued for human review. Google might also use custom algorithms to flag sites for evaluation. 
     

    Regular evaluations of search results. 

     
     Google has a big staff of employees who manually check search results in order to assess their quality. This project is mainly meant to provide feedback to Google's search quality team for improving its algorithms. However, it's feasible that this method may also be used to select specific sites for additional investigation. When a review is initiated, a human reviewer looks at a set of criteria to see whether a penalty is warranted. Whatever the result of the investigation, it's probable that Google preserves the notes in a database for future reference. Google likely keeps a record of all webmasters' past transgressions, whether or not they resulted in a penalty.  
     

    Google Penalties and Manual Actions.

     
     There are numerous different types of manual penalties. Thin-content and link-related penalties are the most well-known forms, but you can also receive a number of additional penalties. The following sections go through some of the most prevalent forms of manual actions. 

    Google has two important sites that will help you understand the various sorts of penalties and what they mean: 





    The content of these two pages, which outline the sorts of activities that cause Google to have problems with your site, is a crucial element of any SEO approach. 


    Here are some of the most typical penalties that websites can face: 


    Penalties for having insufficient material. 


    This penalty is applied to pages that, in Google's judgment, do not provide enough value to users. Unfortunately, when you receive a penalty like this, Google doesn't provide much information about what caused it. It does inform you that you are being penalized for thin content, but the rest is up to you. 

    Thin-content penalties are triggered by four main factors: 

     

    Pages containing little or no valuable information. 

     
     Pages with very little content are possible causes for this penalty, as the name implies. This is particularly true if there are a lot of these pages or if there is a portion of the site where a substantial percentage of the pages are considered thin.

     

    Thin slicing. 

     
    This occurs when publishers create pages only for the purpose of attracting search traffic. These publishers often create a page for each possible search term a visitor may use, even if the differences in content are minor or irrelevant. Publishers sometimes do this unintentionally by auto-generating content pages based on the queries visitors type into the website's search feature. If you decide to implement anything like this, you'll need a thorough review process for weeding out these thin-sliced variants, and a single version of each page to concentrate on.  
     

    Doorway pages. 

     
     These are pages that seem to have been created only for the purpose of monetizing people who have arrived through search engines. These pages may be identified by the fact that they are frequently solitary pages with minimal follow-up material, and/or they are pages that are primarily produced for search engines rather than people. When a user lands on these sites, he or she has two options: purchase now or leave.  
     

    Inadequate integration with the rest of the site. 

     
     Another thing to check for is whether or not sections of your site are nicely integrated with the rest of it. 

    Is there a straightforward method for people to access these pages from the home page, the site's primary navigation, or at the very least a key portion of the site? 

    A thin-content penalty may be imposed if a portion of your site appears to be disconnected from the rest of it. Once you think you have remedied these concerns, you must file a reconsideration request. 

    More information is available in the "Filing Reconsideration Requests" section below. After you've submitted your request, all you have to do now is wait for Google to respond. 

    Normally, this procedure takes two to three weeks. If you succeed, you're in excellent condition; all you have to do now is make sure you don't go overboard again in the future. 

    Otherwise, you'll have to go back to the drawing board to see what you may have overlooked. 

    Penalties for partial links. A partial link penalty is another potential manual action. In the warning you get from Google, this is commonly referred to as an "impacts links" penalty. 

    These penalties mean that one or a few of your pages have been flagged for poor linking practices. Normally, this penalty has only a small impact on the rankings and traffic of those specific pages. 
     

    Link penalties that apply to the whole site. 

     
     Manual link penalties may also be issued for the whole site. This typically suggests that more than a few pages are implicated, and it might even mean that the site's home page is affected. 

    The publisher's sitewide rankings are lowered as a result of this penalty. 

    As a result, the amount of traffic lost is usually much greater than with a partial link penalty.  
     



    Other Types of Manual Actions/Penalties.

       
     

    Cloaking and/or sneaky redirection.

     
     If Google thinks you're presenting different versions of pages to Googlebot than you are to users, you'll receive this notice. 

    To troubleshoot this, fetch the page using Search Console's URL Inspection tool and compare the result with the same page loaded in a browser window. If you don't have access to Search Console, the Mobile-Friendly Test tool is the next best thing. 

    If you see disparities, put in the time and effort to find out how to get rid of them. 

    You should also look for URLs that redirect people to pages that aren't what they expected to see—for example, if they click on anchor text expecting to read an article about a topic they're interested in but instead land on a spammy page trying to sell them something. 

    Conditional redirects, where people who come via Google search or from a certain range of IP addresses are redirected to different pages than other users, are another possible cause of this issue. 
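    A rough way to spot-check for cloaking yourself is to fetch the same URL with a crawler-like user agent and a browser-like one, then compare the responses. This sketch uses only Python's standard library; the URL and user-agent strings are illustrative, and note that sophisticated cloaking keyed to Googlebot's IP addresses would evade a check like this:

```python
import difflib
import urllib.request

def fetch_as(url, user_agent):
    """Fetch a URL while presenting the given User-Agent header."""
    req = urllib.request.Request(url, headers={"User-Agent": user_agent})
    with urllib.request.urlopen(req, timeout=10) as resp:
        return resp.read().decode("utf-8", errors="replace")

def similarity(body_a, body_b):
    """1.0 means identical responses; low values suggest cloaking worth a manual diff."""
    return difflib.SequenceMatcher(None, body_a, body_b).ratio()

# Usage (requires network access):
# bot = fetch_as("https://example.com/", "Googlebot/2.1 (+http://www.google.com/bot.html)")
# user = fetch_as("https://example.com/", "Mozilla/5.0")
# print(similarity(bot, user))
```

    Some variation (timestamps, rotating ads) is normal, so treat a low similarity score as a prompt to diff the two responses by hand rather than as proof of cloaking.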
     

    Keyword stuffing and/or hidden text. 

     
     This alert appears if Google suspects you of cramming keywords into your pages to manipulate search results—for example, by placing white text on a white background, which is invisible to humans but visible to search engines. Another way to trigger this message is to simply repeat your page's core keyword over and over in the hopes of affecting search results. 
     

    Spam created by users. 

     
     This penalty is imposed on websites that accept user-generated content (UGC) but are deemed to be performing a poor job of quality control on that material. 

    It's fairly typical for spammers to target sites with user-generated material by submitting low-quality content with links back to their own sites. 

    Identifying and removing the spammy pages is a short-term solution. The longer-term solution is to set up a system for analyzing and removing spammy material before it enters your site in the first place.  
     

    Unnatural links from your site. 

     
     This means Google thinks you're selling links to other parties or engaging in link schemes to pass PageRank. The solution is simple: either delete the suspect links or add a rel="nofollow" attribute to any links on your site that appear to be paid or sponsored. 
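A rough way to audit your own pages for links that still pass PageRank is to parse the HTML and collect anchors whose rel attribute lacks nofollow (or sponsored, Google's newer attribute for paid links). This sketch uses Python's standard-library parser; the URLs are placeholders.

```python
from html.parser import HTMLParser

class FollowedLinkFinder(HTMLParser):
    """Collect hrefs of <a> tags whose rel attribute includes neither
    'nofollow' nor 'sponsored' — i.e., links that pass PageRank."""
    def __init__(self):
        super().__init__()
        self.followed = []

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        d = dict(attrs)
        rel = (d.get("rel") or "").lower().split()
        if "nofollow" not in rel and "sponsored" not in rel and "href" in d:
            self.followed.append(d["href"])

html = ('<a href="/about">About</a>'
        '<a rel="nofollow" href="https://ads.example.com">Ad</a>'
        '<a rel="sponsored" href="https://partner.example.com">Partner</a>')
finder = FollowedLinkFinder()
finder.feed(html)
print(finder.followed)  # ['/about'] — only the internal link passes PageRank
```

Run something like this across your templates and any pages that accept paid placements, then review each followed external link against your records of sponsorships.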
     

    Master Security Issues Report

     
    This report lists all of the security issues that Google has identified on your site. Google will notify you of the penalty by giving you a notice in Search Console and/or by displaying warnings in the search results that your site has been hacked (and is unsafe to visit). 

    The most prevalent source of this penalty is failure to keep up with upgrades to your content management system (CMS). 

    Spammers use weaknesses in the CMS to alter your web pages, usually to include links to their own sites, but sometimes for more sinister goals such as gaining access to credit card data or other personally identifying information. To fix the issue, you'll need to figure out how your website was hacked. 

    If you don't have any technical personnel on staff, you may need to seek assistance in detecting and correcting the issue. Keep your CMS updated to the most recent version available to limit your risk in the future.  
     

    Pure spam.

     
     If Google feels your site is utilizing particularly aggressive spam methods, it will display this alert in Search Console. This may include things like automatically created nonsense or other approaches that don't seem to be aimed at adding value to people. If you get this notice, you should probably shut down the site and start again. 

    Spammy Freehosts.


    If a substantial fraction of the sites utilizing your hosting firm are spamming, Google may take action against all of the sites hosted there, even if your site is clean as a whistle. Make certain you're dealing with a reliable hosting firm. You must address the root of the complaints in order to solve any of these issues. Follow the method indicated in the section "Filing Reconsideration Requests" when you feel you have done so. 

     

    Diagnosing the Cause of a Traffic Loss

     
    Checking your analytics data to verify whether the decline is due to a loss of organic search engine traffic is the first step in determining the source of a traffic loss.

    If you have Google Analytics, Adobe Analytics, or another analytics package installed on your site, double-check your traffic sources and then isolate only the Google traffic to see if that's what's gone down. 

    Once you've confirmed that the decline is in Google organic search traffic, the next step is to see whether you've received a notification in Google Search Console stating that you've been penalized. 

    If you've gotten one of these messages, you now know what the issue is and how to remedy it. It's not nice to have a problem, but understanding what you're up against is the first step toward healing. If you don't have such a message, you'll have to dig a little more to figure out what's wrong. 

    The next step is to pinpoint the precise date when your traffic began to decline. There are a number of programs on the market that may be used to determine whether there were any important Google changes on that particular day. 
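If you export daily organic-session counts from your analytics package to CSV, a short script can pinpoint the worst day-over-day drop, which you can then compare against known update dates. The column names here ('date', 'organic_sessions') are hypothetical—match them to your actual export.

```python
import csv
import io

def largest_drop(csv_text):
    """Given an analytics export with (hypothetical) columns 'date' and
    'organic_sessions', return the date with the biggest day-over-day
    drop in sessions, and the size of that drop."""
    rows = list(csv.DictReader(io.StringIO(csv_text)))
    worst_date, worst_delta = None, 0
    for prev, cur in zip(rows, rows[1:]):
        delta = int(cur["organic_sessions"]) - int(prev["organic_sessions"])
        if delta < worst_delta:
            worst_date, worst_delta = cur["date"], delta
    return worst_date, worst_delta

export = """date,organic_sessions
2023-03-01,1200
2023-03-02,1180
2023-03-03,430
2023-03-04,410
"""
print(largest_drop(export))  # ('2023-03-03', -750)
```

A single sharp step-down like the one in this sample data is the pattern most characteristic of an algorithm update or penalty; a gradual slide usually points to something else, such as growing competition or decaying content.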



    Here are some tools that you can use to do this: 

    Mozcast: https://moz.com/mozcast/ 

    Moz History of Google Algorithm Changes: https://moz.com/google-algorithm-changes 

    RankRanger Rank Risk Index Tool: https://www.rankranger.com/rank-risk-index/ 

    Accuranker 'Grump' Rating: https://www.accuranker.com/grump 

    Algoroo: https://algoroo.com/ 

    Advanced Web Ranking 

    Cognitive SEO Signals: https://cognitiveseo.com/signals/ 

    If you haven't received a warning from Google Search Console and the date of your traffic loss does not coincide with a known Google algorithm change, determining how to recover is significantly more difficult since you don't know what caused the decline. On a regular basis, Google makes minor modifications to its algorithms. 

    According to Google, these are minor tweaks rather than major updates. Even so, they can have a major influence on your site's traffic, whether favorable or unfavorable. If a change affects you negatively, it may be considerably more difficult to reverse. 

    Google makes daily modifications in part because it enables them to make tiny improvements on a regular basis while also running a range of tests to enhance the algorithm. 

    The breadth of these changes may sometimes reach a point where the industry notices them, and you can see lively conversations about what's going on on Twitter or in key search industry publications like Search Engine Land, Moz, Search Engine Journal, and others. 

    Google confirms some of these updates; others it does not. Regardless, any of them may have a significant influence on your site's traffic. 
     

    Requesting Reconsideration for Manual Actions and Penalties. 

     
     Only manual penalties are subject to reconsideration requests. You won't be able to file one to recover lost traffic unless you have a manual penalty. 

    The second thing to keep in mind regarding your reconsideration request is that it will be reviewed by a person, and that person will most likely be reviewing a large number of them every day. 

    Complaining about what has occurred to your company or being combative with the reviewer will not assist your case. 



    The best strategy is to keep it short and sweet: 



    1. Describe the situation in a few words. If at all feasible, provide some statistics. 

    2. Describe the problem. For example, if you were unaware of the rules, just admit it and inform them that you have now learned them. Say so if a rogue SEO agency did shoddy work for you. 

    3. Describe how you resolved the issue: If you had a link penalty, tell them how many links you were able to remove. Tell them if you did anything drastic, such as removing and/or disavowing all of your links from the previous year. Substantial actions like these can make a big difference and increase your chances of succeeding. 

    4. Make it clear that you plan to follow the Webmaster Guidelines in the future. Keep your reconsideration request brief, as previously said. Cover the important points briefly, and then submit it using the Search Console account associated with the penalized site; a request submitted from an account that isn't associated with that site won't be considered.  
     

    Timeline for requesting a reconsideration.

     
     After you've submitted your request, you'll have to wait. The good news is that you should hear back within two to three weeks. Hopefully, your efforts will be fruitful! If not, you'll have to start again from the beginning to find out what you missed. 
     
    Recovering from Traffic Losses That Weren't Caused by a Manual Action or Penalty. 
     
     Reconsideration requests are only available if you have received a manual penalty. For all other causes of lost traffic, all you can do is make the changes to your site that you believe will help you recover, and wait. Google needs to recrawl your site to see what changes you've made. Even if you've made enough adjustments, it might take Google many months to crawl enough of the new or removed pages to tip the scales in your favor. 
     

    What if you don't make it back? 

     

    Unfortunately, if your results don't improve, it's likely that you haven't done enough to solve the problems that caused your traffic loss. 

    Don't rule out the possibility that your development team made changes that make it tough for Google to crawl your site. 

    Perhaps they changed the platform the site is built on, used JavaScript in a way that hides content from Google, used robots.txt to block content from being crawled, or introduced some other technical problem. 
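Note that robots.txt blocks crawling, not indexing as such, but blocked pages can't be read and effectively drop out of results. Python's standard library includes a robots.txt parser you can use to check whether a given rule set blocks a URL; the rules and URLs below are illustrative examples.

```python
from urllib.robotparser import RobotFileParser

# Parse a robots.txt body directly (you could also use set_url()/read()
# to fetch your live file). These rules are an illustrative example.
rp = RobotFileParser()
rp.parse("""User-agent: *
Disallow: /private/
Disallow: /checkout/
""".splitlines())

for path in ("/private/report.html", "/blog/post-1", "/checkout/cart"):
    url = "https://example.com" + path
    status = "blocked" if not rp.can_fetch("Googlebot", url) else "crawlable"
    print(path, "->", status)
```

Running a check like this against your most important URLs after any platform migration is a cheap way to catch an accidental Disallow rule before it costs you months of traffic.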

    If this isn't the case, you'll need to keep investing in the aspects of your site that you believe are linked to the traffic decline, or that will raise the value of your site more broadly. Take charge of the problem by committing to making your website one of the best on the Internet. This takes a lot of imagination and vision. 

    To be honest, it's not something that everyone can do without major time and financial effort. 

    One thing is certain: you can't afford to take shortcuts when it comes to mitigating the effects of Google traffic losses. If you've put in a lot of time and made a lot of changes, but you still have content or other areas of the site that need improvement, chances are you haven't done enough. You could find yourself four months later wishing you had persevered with your recovery effort. 

    Furthermore, the Google algorithm is always changing. Even if you haven't seen a drop in traffic, Google's message is clear: sites that deliver exceptional content and excellent user experiences will be rewarded the most. 

    As a result, your best bet is to be enthusiastic about designing a site that does both. This is how you maximize your chances of recovering from traffic losses and of weathering future Google updates. 
     

    Final Thoughts.

     
    Traffic losses from manual actions/penalties or algorithmic updates can have a big effect on your company. As a digital marketer, it is therefore vital to understand Google's ever-changing Webmaster Guidelines (http://bit.ly/webmaster_best_practices ), to create appealing websites that meet the needs of the end user, and to promote those websites with legitimacy and longevity in mind.