Saturday, August 4, 2012

Want to compete with established websites on Google?


Established websites have a big advantage over new websites. Websites that were around several years ago had far fewer competitors and it was much easier for them to get high rankings in that environment.

Google places more trust in websites that have been around for a long time. In addition, these sites have had years to collect good backlinks. If you have a new website, it will be difficult to compete with these established sites.

Fortunately, it is possible to compete with the big players if you have a new website. Here are three tips that will help you to compete with established websites:

Tip 1: Target the right keywords

If you have a new website, it's nearly impossible to get high rankings for competitive keywords. Of course, you should include these keywords on your web pages but you should also optimize your web pages for less popular keywords that are not targeted by your competitors.

If your competitors target the keyword "pizza new york", you might optimize one of your pages for "pizza new york queens". Even if your variation gets only a fraction of the searches of the original keyword, you will still get more visitors than before.

It is much better to rank #1 for a slightly less popular keyword (you will get some visitors) than to rank #60 for a very popular keyword (you will get no visitors).

Tip 2: Target as many different keywords as possible

Repeat tip 1 with as many keywords as possible. For example, you might use "best pizza in queens", "recommended pizza restaurant queens", etc. Even if each keyword delivers only some visitors, you will get many visitors with all keywords combined.
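To see how the numbers add up, here is a minimal back-of-the-envelope sketch in Python. All of the search volumes and click-through rates in it are assumptions for illustration, not measured data:

```python
# Hypothetical comparison: ranking #60 for one big keyword vs. ranking #1
# for many long-tail keywords. All numbers below are assumptions.

head_searches = 10_000        # assumed monthly searches for "pizza new york"
ctr_at_rank_60 = 0.0          # results on page 6 get essentially no clicks

long_tail_keywords = 20       # "pizza new york queens", "best pizza in queens", ...
searches_per_keyword = 200    # assumed monthly searches per long-tail keyword
ctr_at_rank_1 = 0.30          # assumed click-through rate at position #1

head_visitors = head_searches * ctr_at_rank_60
long_tail_visitors = long_tail_keywords * searches_per_keyword * ctr_at_rank_1

print(f"One big keyword at #60: {head_visitors:.0f} visitors/month")
print(f"{long_tail_keywords} long-tail keywords at #1: {long_tail_visitors:.0f} visitors/month")
```

Under these assumed numbers, the long-tail pages bring in hundreds of visitors per month while the page-six ranking brings in essentially none.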

Another advantage of getting high rankings for many of these long-tail keywords is that your website becomes relevant to a topic. The more high rankings your website has for keywords related to a topic, the easier it becomes to rank for the more competitive keywords in that topic.

Optimize different pages of your website for different keywords, and optimize as many web pages as possible. People who search for long-tail keywords usually care deeply about the subject. They are more likely to buy something, and they might also link to your site.

Tip 3: Create web pages to attract more backlinks

Some web pages on your website should generate revenue; others should help you to get backlinks. The more backlinks your website has, the easier it is to get high rankings.

Create web pages that solve the problems of web searchers. Write "how to" articles about very specific topics. These pages often attract many backlinks. Of course, you should also actively build backlinks.

Getting high rankings with a new website is not difficult if you take the right approach. Do not attack the big players directly. Start with many related long-tail keywords and then move on to the more competitive keywords.

Focus both on optimizing your web pages and on getting good backlinks. Your website needs well-optimized content and good backlinks if you want to get high rankings on Google.


Wednesday, July 18, 2012

Good Websites = Good Sales!


Cephalalgia... what?

How to get more sales without it.

If your web pages are difficult to understand, you might not make as much money as you could with your website. A simple tool can help you to check the readability of your pages.


Difficult language = fewer sales
Many websites use technical language that is difficult for the average web surfer to understand. Some webmasters also like long and complicated sentences.
Long and complicated sentences are not a sign of professionalism. They just show that the author of the sentences doesn't care about the readers.
The more complicated the text on your web pages, the more likely it is that a visitor will leave your website.
Website visitors are impatient. If they have to work to understand your web pages, they will leave and look for a site that is easier to understand.
How to check the readability of your web pages
The Flesch Reading Ease test is a readability formula that is also used as a standard by some United States government agencies. Higher scores indicate text that is easier to read; lower scores mark harder-to-read text. The related Flesch-Kincaid Grade Level shows the approximate number of years of education required to understand the text.
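For illustration, here is a minimal Python sketch of the two formulas. The syllable counter is only a rough approximation, and the sample sentence is made up:

```python
import re

def count_syllables(word: str) -> int:
    """Very rough syllable estimate: count groups of consecutive vowels."""
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

def flesch_scores(text: str) -> tuple[float, float]:
    """Return (Flesch Reading Ease, Flesch-Kincaid Grade Level) for a text."""
    sentences = max(1, len(re.findall(r"[.!?]+", text)))
    words = re.findall(r"[A-Za-z']+", text)
    syllables = sum(count_syllables(w) for w in words)
    words_per_sentence = len(words) / sentences
    syllables_per_word = syllables / len(words)
    reading_ease = 206.835 - 1.015 * words_per_sentence - 84.6 * syllables_per_word
    grade_level = 0.39 * words_per_sentence + 11.8 * syllables_per_word - 15.59
    return reading_ease, grade_level

ease, grade = flesch_scores("Don't say cephalalgia when you can also say headache.")
print(f"Reading Ease: {ease:.1f}, Grade Level: {grade:.1f}")
```

As a rough rule of thumb, Reading Ease scores above 60 are usually considered plain English that most readers can follow.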
How to improve the readability of your pages
There are several things that you can do to improve the readability of your web pages:
  • Write short sentences.
  • Use many paragraphs.
  • Use headings to structure the content.
  • Use bullet lists.
  • Avoid complicated words. Don't say "cephalalgia" when you can also say "headache".
  • Use images on your pages.
Do not require too much work from your website visitors. The easier your website is to use, the more you will sell.
Make it easy to understand your web pages, make it easy to navigate them and make it easy to buy on your website.

Baidu SEO Impacted By Major Algorithm Updates



Baidu has been implementing big algorithm updates since April, and the updates have had a significant impact on both indexation and rankings of thousands of websites. It seems that the updates are ongoing and it has been widely reported that two major updates happened in the past month, one on June 22, 2012 and the other on June 28, 2012. These updates have also been acknowledged by Baidu Engineer “Lee” from Baidu’s anti-spam team. This is quite a new development for Baidu SEO as previously Baidu generally did not acknowledge when changes had been made and it was left to the blogosphere to piece together what had happened.

So what's the update about? According to Lee, the June 22, 2012 update implemented anti-spam measures targeting poor quality websites. Poor quality websites have long appeared in Baidu search results, and Baidu is of course concerned that this leads to a poor user experience for searchers. Lee defined poor quality websites as those with very poor content, in particular "spun" content, whose sole purpose is to manipulate the search results, a common practice in Baidu SEO.
After this June 22nd update, many webmasters complained that their sites lost significant rankings or disappeared altogether. Most of those complaining were SME company websites that do not update their content frequently. Lee did acknowledge that Baidu can generally identify and remove websites with poor content accurately, but "collateral damage" is hard to avoid. If this happens to your website, you can report it through http://tousu.baidu.com/webmaster/suggest. Lee also said this is the start of a series of actions; in the future, Baidu will announce big updates through the recently launched Baidu Webmaster Tools. If you don't have Baidu Webmaster Tools set up, now might be a good time.
This is really a wake-up call to all webmasters and especially those who attempt to create dozens of websites in order to manipulate the search results. Just like with Google in the past, Baidu SEOers may have to work a little harder than before to get results!
Reference: Lee's letter to webmasters is posted in Chinese at http://bbs.zhanzhang.baidu.com/thread-6533-1-1.html.

Saturday, July 14, 2012

How to get more clicks from rankings that you already have!


Although the meta description tag doesn't influence the rankings of your web pages much, it can have a major influence on the number of clicks that you get. By attracting more clicks, good meta descriptions can also have a positive influence on your rankings.

What are meta descriptions and why are they still important?
The meta description tag is placed in the <head> section of the HTML code of a web page.
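For illustration, here is a minimal Python sketch with a made-up example page. It shows what the tag looks like in the <head> section and one way to pull it out of the HTML:

```python
from html.parser import HTMLParser

# A made-up example page; the meta description sits in the <head> section.
SAMPLE_HTML = """
<html>
  <head>
    <title>Pizza Delivery in Queens | Example Pizza Co.</title>
    <meta name="description"
          content="Hot, hand-made pizza delivered anywhere in Queens in 30 minutes. Order online now.">
  </head>
  <body>...</body>
</html>
"""

class MetaDescriptionParser(HTMLParser):
    """Collects the content of <meta name="description"> tags."""
    def __init__(self):
        super().__init__()
        self.descriptions = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "meta" and attrs.get("name", "").lower() == "description":
            self.descriptions.append(attrs.get("content", ""))

parser = MetaDescriptionParser()
parser.feed(SAMPLE_HTML)
for description in parser.descriptions:
    print(f"{len(description)} chars: {description}")
```

As a side note, descriptions much longer than roughly 150-160 characters tend to be truncated in the search result snippet, so checking the length as above is worthwhile.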

Some people think that meta descriptions are not important anymore. However, Google Webmaster Tools shows an alert if a meta description is not unique. That's an indicator that the descriptions still matter.
  • Good meta descriptions can attract more clicks and a higher click-through rate.
  • A higher click-through rate can have a positive effect on the rankings of the web page.
In other words: good meta descriptions can deliver new customers to your website while contributing to your rankings at the same time.
How to optimize your meta descriptions
  • Meta descriptions should be unique: each page on your website should have its own meta description.
  • Meta descriptions should be relevant: boilerplate meta descriptions won't do your website any good. The meta description should be relevant to the content of the page.
  • Include a call to action: think of what you want users to do when they visit your page and include a call to action in the description. The following verbs might help you: buy, download, shop, browse, etc.
  • Show your USP: tell your customers what makes you different and highlight this unique proposition in the meta description.
Web searchers have many options on the result page. Offer something special to attract their clicks. The meta description tag is not an immediate ranking factor. If you want to get listed on Google's first result page, optimize the content of your web pages and try to get good backlinks to your site.

Then optimize the meta descriptions of your web pages to make sure that you get as many visitors as possible through your rankings.

Monday, June 25, 2012

How to Optimize Press Releases


Press releases are a gold mine for attracting high quality links and building site credibility for an SEO campaign. If you already utilize press releases, you might as well get the most out of them to help build your online presence.
By optimizing a press release, you gain trusted backlinks, offer timely content, and open the door to social sharing: all signals that search engines use to determine relevance and rank pages well in the results.

Where to Start

Optimizing a press release can be simple, and it starts with creating an SEO-friendly title. Do this by including a keyword in your title. To maximize the potential of a press release, craft a title that is interesting and "clickable." Distribution services for press releases also allow you to customize the URL for the release, so include a keyword there as well.

Polishing the Press Release

After you have the title right, use keyword-rich anchor text throughout the press release. Don't use more than two keyword links in a single press release. It is also a good idea to place the most important link at the beginning of the press release.
When it comes to choosing the right keywords, keep in mind that press releases, as opposed to blogs, are one of the best places to use branded keywords. Make sure the keywords and links flow naturally throughout the press release. At the end of your press release, it is recommended that you include a keyword in your boilerplate.

Optimizing Afterward

Once your press release has been submitted, make sure to follow up with other SEO and social media strategies. Take the time to publish your press release on other outlets and social media platforms to maximize outreach. Your goal should be to get your press release picked up by as many reputable sources as possible so your linked keywords have more credibility to boost your page rankings.
After about two weeks, do an analysis for your press release to measure your efforts. If you follow these simple steps, you should start to get more out of your press releases.

Pandas, Penguins and Predictions



Given the guarantee that Google will continually tweak their algorithms, the best way to handle potential impacts is to prepare. Being proactive and anticipating Google’s next move helps protect rankings and your online presence.
Here are some things we believe will become the central focus of future algorithm updates, how they will affect the industry, and suggestions for keeping up with it all.
1. Over-Optimization: Google intentionally targets websites that try to manipulate natural search engine results pages. If you're trying to manipulate Google for better rankings, think twice. After the most recent Penguin update, plenty of sites paid the price in lowered rankings. It wouldn't be far-fetched to assume that future updates will continue to punish over-optimized websites that try to cheat the system. Play your cards right, or pay the price.

Sure, saying you should avoid over-optimization is easy, but there's no exact formula that defines what constitutes too much SEO. However, we do have an idea of how to stay on the safer side. A good approach is to continually focus on best practices. This is not an earth-shattering recommendation, but we suggest spending time crafting quality content. Also, focus on visitor interactions and essential SEO elements. Don't go overboard on tools and techniques that attempt to attract unnatural, quick rankings.

2. Quality: A lot of websites that were hit by the most recent Penguin update were involved in a number of these issues:
  • Comment spam
  • Article marketing sites
  • Guest posting on questionable sites
  • Paid text links

The common theme is punishing low quality content. Google has preached that worthwhile content will naturally attract more links, and future algorithm updates will continue to push others to adopt this philosophy. For us, credibility and quality go hand-in-hand. We recommend building a credible business and website by balancing appropriate content development and promotion as well as link-building. Websites that balance optimization practices with a more user-oriented experience will likely be rewarded in the future.

3. Engagement: User experience and engagement are important. Because Google rewards sites that it believes users will find relevant and valuable, it is a good guess that the future of SEO could include engagement as a ranking factor. At present, there is no exact measurement to gauge engagement, but nothing in SEO is certain.

As we mentioned above, creating a natural user experience, developing a relationship, and encouraging action could be even more crucial in the future. We suggest continuing to dive into the social aspect of SEO. Be sure to stay actively involved in social networks, comments, and reviews to encourage natural participation from others. By being active on social media platforms, you can expect more visitor engagement on your site.

As much as people fear Google updates, if you aren't trying to take advantage of the system you shouldn't stress too much. In the end, the same type of websites will be punished by Panda, Penguin, or any other animal in the future: those that try to take shortcuts.

Sunday, June 24, 2012

How to get high local rankings

Local search ranking factors to get high local rankings

40 local search marketers around the world participated in a survey to find the most important ranking factors for local search. If you want to get high rankings in Google's local search results the survey can help you.





Why is it important to get good local rankings?
For some keywords (for example "plumbers"), Google shows local results at the top of the search result list. These websites aren't listed at the top because they have good content or good backlinks.
They are listed at the top because they match the geographic area of the search query. If keywords that show local results are relevant to your business, it is important to be listed in the local search results.
The five most important factors for local results
According to the survey, the following five factors are the most important factors that influence the overall ranking of a website in Google's local results:
  1. The physical address in the city of search
  2. Proper category association
  3. The proximity of the address to the centroid (the geographic center of the city)
  4. The domain authority of the website
  5. The quantity of structured citations on Internet yellow pages websites and data aggregators
It also helps to have the city and the state on the landing page.
The top five factors that have a negative influence on your local rankings
The following factors can have a negative influence on your local rankings:
  1. A mismatch of phone numbers across data ecosystems. For example, tracking phone numbers can negatively affect your local rankings.
  2. Multiple Google Place pages with the same phone number.
  3. Multiple Google Places pages with the same or similar business title and address.
  4. A mismatch between the address and/or the phone number on Google Places and the landing page.
  5. Including location keywords in categories.
If your website doesn't have a crawlable version of your name, address and phone number on the landing pages, Google might not give your website high rankings.
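For illustration, here is a minimal Python sketch that renders the name, address and phone number as plain, crawlable HTML text for every landing page. The business details are made up; the point is that the values appear as text (not as an image or script) and match the Google Places listing exactly:

```python
# Made-up business details; in practice they must match the Google Places
# listing exactly (same phone number, not a separate tracking number).
BUSINESS = {
    "name": "Example Pizza Co.",
    "street": "123 Main Street",
    "city": "Queens",
    "state": "NY",
    "zip": "11101",
    "phone": "(718) 555-0100",
}

CONTACT_BLOCK = """
<div class="contact">
  <p>{name}<br>
     {street}<br>
     {city}, {state} {zip}<br>
     Phone: {phone}</p>
</div>
""".format(**BUSINESS)

print(CONTACT_BLOCK)  # include this block on every landing page, e.g. in the footer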
Getting high rankings on Google is important if you want to get more customers and more sales. 

Wednesday, June 6, 2012

How to make your website relevant

Last month, Google introduced the Knowledge Graph with the claim "things, not strings." Google says that the new algorithm "understands the world a bit more like people do." This has a major impact on how you have to optimize your web pages.



The context of the keyword is important
The same keyword can have multiple meanings and the same search can have multiple intentions. Do you mean Jordan the place, or Jordan the rockstar?
People are interested in knowing what music George Gershwin composed, whereas they’re less interested in what dialog Mariah Carey delivered, and more in what songs she sang.
In the past, search engines simply looked for the words that were used in the query. Now the meaning and the intention also plays a role.
You have to show Google that your website is relevant
If you want to get high rankings for the keyword "SEO Services India" then it is no longer enough to have that keyword on your web page. There are several things that you can do to improve the position of your page:
  • A single page should be as closely related to a single aspect of the keyword as possible. The more targeted your page is to that aspect, the more likely it is that it will be chosen for the results.
  • To make your website relevant to a topic, it helps if you have multiple pages that deal with different aspects of the topic. For example, one page on your site could define Search Engine Optimization, other pages could explain why to choose an Indian SEO agency, others could describe your SEO packages, etc.
  • Use different words that describe the topic of your site. Use "SEO", "Organic Search", "Search Engine Optimization", "Search Marketing", etc. depending on what your website is about.
It's not enough that your web page is relevant to a keyword. It must also be relevant to a topic. The more pages you optimize, the more likely it is that your website will become relevant to your topic.
The right keywords are very important but it is also important that you use them correctly on your web pages. Create targeted and focused web pages and optimize as many pages as possible on your website. The more highly relevant web pages your website has, the more likely it is that your website will get high rankings on Google.

In the end, web page content writing is the art of expressing your views in an appealing way, so that it not only makes visitors stick to your website but also converts them into leads and customers.

Wednesday, May 16, 2012

Boost your web page titles

With all the hype about Google's latest algorithm updates, it is easy to forget about the basics that are needed to get high rankings. Before you work on the links to your site, your web page basics must be right.

For example, the title tag has a very big influence on the search engine positions of a web page. In addition, the title tag is very important because it is the first thing that people see in the search results. There are several things that you can do to optimize your web page titles.


Step 1: check the HTML code of your web pages
It's astonishing how many pages have more than one title tag in the code. If your pages have more than one title tag, they will confuse search engine robots. The tags might not be read at all or, even worse, search engines might interpret this as a spamming attempt.
Step 2: use your keywords in your web page titles
Use the most relevant keywords at the beginning of your web page titles. Many people only scan the beginning of the titles so make sure that the beginning of the title tag contains the right words.
Step 3: make the titles clear, predictable and irresistible
The readers of your web page titles should know what your web page is about and the title should lead to a page that meets the expectations of the searcher. The web page title is the first thing that people see in the search results. Make sure that the wording of your web page titles is appealing, emotional and irresistible.
Step 4: keep the titles short
Although there is no official limit, you should keep your web page titles below 70 characters. Most search engines truncate titles after about 65-70 characters on the search result pages; Google, for example, shows roughly the first 64 characters.
Step 5: remove unnecessary words
Words such as "homepage", "index" and your company name usually don't help your search engine rankings. Remove these words from the title tags of your web pages, and your titles will become more relevant to your actual keywords.
Step 6: use unique title tags
Don't use the same title tags on more than one page. Each page should have a unique title tag that reflects the content of the web page. In general, optimize different pages of your website for different keywords. The more pages of your website you optimize the better.
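To tie the steps together, here is a minimal Python sketch that checks a few pages for multiple title tags, a missing keyword, and over-long titles. The sample pages are made up, and the 70-character limit follows Step 4 above:

```python
import re
from collections import Counter

MAX_TITLE_LENGTH = 70  # Step 4: most engines truncate titles after 65-70 characters

def extract_titles(html: str) -> list[str]:
    """Return the text of every <title> tag found in the HTML (Step 1)."""
    return re.findall(r"<title[^>]*>(.*?)</title>", html, re.IGNORECASE | re.DOTALL)

def check_page(url: str, html: str, keyword: str) -> list[str]:
    """Return a list of title-related problems for a single page."""
    problems = []
    titles = extract_titles(html)
    if len(titles) != 1:
        problems.append(f"{url}: found {len(titles)} title tags (expected exactly 1)")
    for title in titles:
        if keyword.lower() not in title.lower():
            problems.append(f"{url}: keyword '{keyword}' missing from title '{title}'")
        if len(title) > MAX_TITLE_LENGTH:
            problems.append(f"{url}: title is {len(title)} characters and may be truncated")
    return problems

# Made-up sample pages for illustration only.
pages = {
    "/queens": "<title>Pizza Delivery in Queens | Example Pizza Co.</title>",
    "/menu": "<title>Homepage</title><title>Menu</title>",
}

for url, html in pages.items():
    for problem in check_page(url, html, keyword="pizza"):
        print(problem)

# Step 6: every page should have a unique title.
title_counts = Counter(t for html in pages.values() for t in extract_titles(html))
for title, count in title_counts.items():
    if count > 1:
        print(f"Title used on {count} pages: '{title}'")
```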
The title tags of your web pages are only one factor that influences the position of your web pages in Google's search results.

In a related case, Googler John Mueller took a look at a site and noticed an unusually high number of title attributes on its pages, many stuffed with keywords. He said that using the title attribute in this way can look "sneaky" to Google's algorithms. (Note that the title attribute, which can be added to links and images, is not the same as the title tag.)

Tuesday, May 8, 2012

Google Announced 52 Search Updates

Google released their monthly update on the changes they made to the Google search engine over the past month. It is great that Google does this, and this time they shared 52 changes for April. Below are some important ones.

Penguin Related:

  • Anchors bug fix
  • Keyword stuffing classifier improvement
  • More authoritative results
  • Improvement in a freshness signal
  • No freshness boost for low-quality content
  • Improvements to how search terms are scored in ranking

Ranking Changes:
  • Improvement in a freshness signal. [launch codename "citron", project codename "Freshness"] This change is a minor improvement to one of the freshness signals which helps to better identify fresh documents.
  • No freshness boost for low-quality content. [launch codename "NoRot", project codename "Freshness"] We have modified a classifier we use to promote fresh content to exclude fresh content identified as particularly low-quality.
  • Smoother ranking changes for fresh results. [launch codename "sep", project codename "Freshness"] We want to help you find the freshest results, particularly for searches with important new web content, such as breaking news topics. We try to promote content that appears to be fresh. This change applies a more granular classifier, leading to more nuanced changes in ranking based on freshness.
  • Improvements to how search terms are scored in ranking. [launch codename "Bi02sw41"] One of the most fundamental signals used in search is whether and how your search terms appear on the pages you're searching. This change improves the way those terms are scored.
  • Backend improvements in serving. [launch codename "Hedges", project codename "Benson"] We've rolled out some improvements to our serving systems making them less computationally expensive and massively simplifying code.
  • Keyword stuffing classifier improvement. [project codename "Spam"] We have classifiers designed to detect when a website is keyword stuffing. This change made the keyword stuffing classifier better.
  • More authoritative results. We've tweaked a signal we use to surface more authoritative content.

Link Analysis Changes:
  • Anchors bug fix. [launch codename "Organochloride", project codename "Anchors"] This change fixed a bug related to our handling of anchors.

Index Updates:
  • Increase base index size by 15%. [project codename "Indexing"] The base search index is our main index for serving search results and every query that comes into Google is matched against this index. This change increases the number of documents served by that index by 15%. *Note: We're constantly tuning the size of our different indexes and changes may not always appear in these blog posts.
  • New index tier. [launch codename "cantina", project codename "Indexing"] We keep our index in "tiers" where different documents are indexed at different rates depending on how relevant they are likely to be to users. This month we introduced an additional indexing tier to support continued comprehensiveness in search results.

Search Listings:
  • More domain diversity. [launch codename "Horde", project codename "Domain Crowding"] Sometimes search returns too many results from the same domain. This change helps surface content from a more diverse set of domains.
  • Categorize paginated documents. [launch codename "Xirtam3", project codename "CategorizePaginatedDocuments"] Sometimes, search results can be dominated by documents from a paginated series. This change helps surface more diverse results in such cases.
  • Country identification for webpages. [launch codename "sudoku"] Location is an important signal we use to surface content more relevant to a particular country. For a while we've had systems designed to detect when a website, subdomain, or directory is relevant to a set of countries. This change extends the granularity of those systems to the page level for sites that host user generated content, meaning that some pages on a particular site can be considered relevant to France, while others might be considered relevant to Spain.
  • Disable salience in snippets. [launch codename "DSS", project codename "Snippets"] This change updates our system for generating snippets to keep it consistent with other infrastructure improvements. It also simplifies and increases consistency in the snippet generation process.
  • More text from the beginning of the page in snippets. [launch codename "solar", project codename "Snippets"] This change makes it more likely we'll show text from the beginning of a page in snippets when that text is particularly relevant.
  • Tweak to trigger behavior for Instant Previews. This change narrows the trigger area for Instant Previews so that you won't see a preview until you hover and pause over the icon to the right of each search result. In the past the feature would trigger if you moused into a larger button area.
  • Better query interpretation. This launch helps us better interpret the likely intention of your search query as suggested by your last few searches.
  • News universal results serving improvements. [launch codename "inhale"] This change streamlines the serving of news results on Google by shifting to a more unified system architecture.
  • More efficient generation of alternative titles. [launch codename "HalfMarathon"] We use a variety of signals to generate titles in search results. This change makes the process more efficient, saving tremendous CPU resources without degrading quality.
  • More concise and/or informative titles. [launch codename "kebmo"] We look at a number of factors when deciding what to show for the title of a search result. This change means you'll find more informative titles and/or more concise titles with the same information.
  • "Sub-sitelinks" in expanded sitelinks. [launch codename "thanksgiving"] This improvement digs deeper into megasitelinks by showing sub-sitelinks instead of the normal snippet.
  • Better ranking of expanded sitelinks. [project codename "Megasitelinks"] This change improves the ranking of megasitelinks by providing a minimum score for the sitelink based on a score for the same URL used in general ranking.
  • Sitelinks data refresh. [launch codename "Saralee-76"] Sitelinks (the links that appear beneath some search results and link deeper into the site) are generated in part by an offline process that analyzes site structure and other data to determine the most relevant links to show users. We've recently updated the data through our offline process. These updates happen frequently (on the order of weeks).
  • Less snippet duplication in expanded sitelinks. [project codename "Megasitelinks"] We've adopted a new technique to reduce duplication in the snippets of expanded sitelinks.

Local Changes:
  • More local sites from organizations. [project codename "ImpOrgMap2"] This change makes it more likely you'll find an organization website from your country (e.g. mexico.cnn.com for Mexico rather than cnn.com).
  • Improvements to local navigational searches. [launch codename "onebar-l"] For searches that include location terms, e.g. [dunston mint seattle] or [Vaso Azzurro Restaurant 94043], we are more likely to rank the local navigational homepages in the top position, even in cases where the navigational page does not mention the location.
  • More comprehensive predictions for local queries. [project codename "Autocomplete"] This change improves the comprehensiveness of autocomplete predictions by expanding coverage for long-tail U.S. local search queries such as addresses or small businesses.

Images & Videos:
  • Improvements to SafeSearch for videos and images. [project codename "SafeSearch"] We've made improvements to our SafeSearch signals in videos and images mode, making it less likely you'll see adult content when you aren't looking for it.
  • Improved SafeSearch models. [launch codename "Squeezie", project codename "SafeSearch"] This change improves our classifier used to categorize pages for SafeSearch in 40+ languages.





Source

Wednesday, April 18, 2012

Received Google unnatural links message?





During the last few weeks, Google sent many webmaster notification messages about unnatural links. If you have received such a message, here's what you have to do to make sure that your website doesn't get penalized.



What is this Google unnatural links message?
Google has sent the following message to many webmasters:
"Dear site owner or webmaster of example.com.
We’ve detected that some of your site’s pages may be using techniques that are outside Google’s Webmaster Guidelines.
Specifically, look for possibly artificial or unnatural links pointing to your site that could be intended to manipulate PageRank. Examples of unnatural linking could include buying links to pass PageRank or participating in link schemes.
We encourage you to make changes to your site so that it meets our quality guidelines. Once you’ve made these changes, please submit your site for reconsideration in Google’s search results.
If you find unnatural links to your site that you are unable to control or remove, please provide the details in your reconsideration request.
If you have any questions about how to resolve this issue, please see our Webmaster Help Forum for support.
Sincerely,
Google Search Quality Team"
What will happen when you get such a message?
Webmasters who received such a message observed that many sites were penalized 3-4 weeks after the message. The penalty is for the keywords that are included in the unnatural links.
If most of the links that point to the website are unnatural links then the whole website might be penalized.
What are unnatural links?
There are several backlink types that Google finds unnatural:
  1. Public backlink networks: Google doesn't like fully automated or paid backlink networks. If you participate in a backlink network that can be joined by anyone (even for a fee) then it is very likely that Google has already penalized the network or that the network is a target for the near future.

    If you can find the backlink network, Google's engineers can find it, too.
  2. Private backlink networks: some SEO agencies have private backlink networks. Google probably has the technology to detect these networks without creating accounts.
  3. Paid sidebar links: if your website has too many backlinks from the sidebars of other websites, Google might find them unnatural.
  4. Over-optimized anchor text: if the links to your website all use exactly the same anchor text, Google might reconsider the rankings of the linked pages.
  5. Fake forum and social media links: some tools create fake forum and social media site accounts to get backlinks to your website. Chances are that Google can detect that type of link.
What should you do?
If you used one of the methods above to get backlinks to your website, try to get rid of these links as quickly as possible.
The formula to high rankings on Google is very easy:
Good content + good backlinks + no spam = high rankings
Optimize the content of your web pages to make sure that Google and other search engines know what your website is about. Then get good backlinks to show search engines that your website can be trusted.
Google has become more aggressive about spammy backlinks. In the past, a website might get high rankings for several months until Google detected the spam. Now it seems that it takes a maximum of about 3 months until Google detects the spammers.

Do not fall for SEO solutions that promise quick and easy backlinks. It doesn't make sense to get high rankings for 2-3 months just to get penalized after that. If you are serious about your business, you have to use strategies that deliver high rankings that last, even if it takes longer to get those rankings.

How to create a content generation strategy




Having a “good enough” content strategy is what puts you behind the competition. A content strategy needs to be more than high word counts and filling in white space. Your strategy needs to lay out an entire plan for the creation, publication, and control of usable and useful content.
The principles of a content strategy center on developing valuable website content that improves the user's experience. Strategists need to build a structure for the content, decide what content needs to be developed, and decide why it should be published.
A superior content generation strategy will:
  1. Build loyalty - With fresh content on a regular basis, your website or blog will urge visitors to return frequently. Consistency can help build industry credibility.
  2. Appeal to search engines - Consistent and well-organized content can help you secure higher rankings in the search engines because it helps your website remain relevant.
  3. Generate new traffic - Your content generation strategy helps you keep your site up-to-date and timely. As such, your website will spark more interest and bring in additional visitors.
Before you can create a content generation strategy you need to have your keyword research completed. Your keyword research will help you maintain a focus for your content generation strategy.
Creating a content generation strategy begins with brainstorming. Use your keyword research as a structure to appease the search engines, but find topics surrounding the keywords to be relevant to readers.
Once you have brainstormed topics and determined what you want to write, organizing the information is the next step. From an SEO perspective, organizing your information prevents you from duplicating content.
You can organize your content by putting different topics in categories and lists. By mapping out your content, your strategy helps solidify quality content that will engage the right audience. Decide what type of content works best including blog content, eBook content, white papers, or even webinar content. It also helps to build a style guide for your site so everyone who will be contributing content will be able to work within your set strategy and structure.
By combining critical thinking with brainstorming and keeping yourself organized, you can maximize your content generation strategy by making sure to hit all the SEO basics such as creating links and including images, alt text, videos, and audio.