Allah is Almighty.



Saturday, May 20, 2017

Constantly Changing SEO


The Competitive Nature of Search Engines
Take a look at any search results page and you'll find the answer to why search marketing has a long, healthy life ahead.
[Screenshots: Google, Yahoo, and Bing search results pages]
There are, on average, ten positions on the search results page. The pages that fill those positions are ordered by rank. The higher your page is on the search results page, the better your click-through rate and ability to attract searchers. Results in positions 1, 2, and 3 receive much more traffic than results down the page, and considerably more than results on deeper pages. The fact that so much attention goes to so few listings means that there will always be a financial incentive for search engine rankings. No matter how search may change in the future, websites and businesses will compete with one another for this attention, and for the user traffic and brand visibility it provides.

Constantly Changing SEO

When search marketing began in the mid-1990s, manual submission, the meta keywords tag, and keyword stuffing were all regular parts of the tactics necessary to rank well. In 2004, link bombing with anchor text, buying hordes of links from automated blog comment spam injectors, and the construction of inter-linking farms of websites could all be leveraged for traffic. By 2011, social media marketing and vertical search inclusion had become mainstream methods for conducting search engine optimization. The search engines have refined their algorithms along with this evolution, so many of the tactics that worked in 2004 can hurt your SEO today.
The future is uncertain, but in the world of search, change is a constant. For this reason, search marketing will continue to be a priority for those who wish to remain competitive on the web. Some have claimed that SEO is dead, or that SEO amounts to spam. As we see it, there's no need for a defense other than simple logic: websites compete for attention and placement in the search engines, and those with the knowledge and experience to improve their website's ranking will receive the benefits of increased traffic and visibility.




Written by Moz Staff


Why Search Engine Marketing is Necessary
An important aspect of SEO is making your website easy for both users and search engine robots to understand. Although search engines have become increasingly sophisticated, they still can't see and understand a web page the same way a human can. SEO helps the engines figure out what each page is about, and how it may be useful for users.

A Common Argument Against SEO

We frequently hear statements like this:
"No smart engineer would ever build a search engine that requires websites to follow certain rules or principles in order to be ranked or indexed. Anyone with half a brain would want a system that can crawl through any architecture, parse any amount of complex or imperfect code, and still find a way to return the most relevant results, not the ones that have been 'optimized' by unlicensed search marketing experts."

But Wait ...

Imagine you posted a picture of your family dog online. A human might describe it as "a black, medium-sized dog, looks like a Lab, playing fetch in the park." On the other hand, the best search engine in the world would struggle to understand the photo at anywhere near that level of sophistication. How do you make a search engine understand a photograph? Fortunately, SEO allows webmasters to provide clues that the engines can use to understand content. In fact, adding proper structure to your content is essential to SEO.
Understanding both the abilities and limitations of search engines allows you to properly build, format, and annotate your web content in a way that search engines can digest. Without SEO, a website can be invisible to search engines.

The Limits of Search Engine Technology

The major search engines all operate on the same principles, as explained in Chapter 1. Automated search bots crawl the web, follow links, and index content in massive databases. They accomplish this with dazzling artificial intelligence, but modern search technology is not all-powerful. There are numerous technical limitations that cause significant problems in both inclusion and rankings. We've listed the most common below:

Problems Crawling and Indexing

  • Online forms: Search engines aren't good at completing online forms (such as a login), and thus any content contained behind them may remain hidden.
  • Duplicate pages: Websites using a CMS (Content Management System) often create duplicate versions of the same page; this is a major problem for search engines looking for completely original content.
  • Blocked in the code: Errors in a website's crawling directives (robots.txt) may lead to blocking search engines entirely.
  • Poor link structures: If a website's link structure isn't understandable to the search engines, they may not reach all of a website's content; or, if it is crawled, the minimally-exposed content may be deemed unimportant by the engine's index.
  • Non-text content: Although the engines are getting better at reading non-HTML text, content in rich media format is still difficult for search engines to parse. This includes text in Flash files, images, photos, video, audio, and plug-in content.
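The "blocked in the code" pitfall above is easy to check programmatically. As a minimal sketch, using Python's standard-library robots.txt parser (the rules and URLs below are hypothetical, not a real site's file):

```python
# Sketch: how a robots.txt directive hides content from a compliant crawler.
# The rules and URLs here are hypothetical examples.
from urllib.robotparser import RobotFileParser

rules = """\
User-agent: *
Disallow: /private/
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# Anything under /private/ is invisible to a compliant crawler; a single
# stray "Disallow: /" here would hide the entire site.
print(parser.can_fetch("Googlebot", "https://example.com/about"))      # True
print(parser.can_fetch("Googlebot", "https://example.com/private/x"))  # False
```

Running a check like this against the paths you intend to expose is a quick way to catch a crawling directive that blocks more than you meant it to.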

Problems Matching Queries to Content

  • Uncommon terms: Text that is not written in the common terms that people use to search. For example, writing about "food cooling units" when people actually search for "refrigerators."
  • Language and internationalization subtleties: For example, "color" vs. "colour." When in doubt, check what people are searching for and use exact matches in your content.
  • Incongruous location targeting: Targeting content in Polish when the majority of the people who would visit your website are from Japan.
  • Mixed contextual signals: For example, the title of your blog post is "Mexico's Best Coffee" but the post itself is about a vacation resort in Canada which happens to serve great coffee. These mixed messages send confusing signals to search engines.

Make sure your content gets seen

Getting the technical details of search engine-friendly web development correct is important, but once the basics are covered, you must also market your content. The engines by themselves have no formulas to gauge the quality of content on the web. Instead, search technology relies on the metrics of relevance and importance, and they measure those metrics by tracking what people do: what they discover, react to, comment on, and link to. So, you can’t just build a perfect website and write great content; you also have to get that content shared and talked about.



Tuesday, May 16, 2017

The True Power of Inbound Marketing with SEO


Why should you invest time, effort, and resources on SEO? When looking at the broad picture of search engine usage, fascinating data is available from several studies. We've extracted those that are recent, relevant, and valuable, not only for understanding how users search, but to help present a compelling argument about the power of SEO.
A Broad Picture

Google leads the way in an October 2011 study by comScore:

  • Google led the U.S. core search market in April with 65.4 percent of the searches conducted, followed by Yahoo! with 17.2 percent, and Microsoft with 13.4 percent. (Microsoft powers Yahoo Search. In the real world, most webmasters see a much higher percentage of their traffic from Google than these numbers suggest.)
  • Americans alone conducted a staggering 20.3 billion searches in one month. Google accounted for 13.4 billion searches, followed by Yahoo! (3.3 billion), Microsoft (2.7 billion), Ask Network (518 million), and AOL LLC (277 million).
  • Total search powered by Google properties equaled 67.7 percent of all search queries, followed by Bing which powered 26.7 percent of all search.

Billions spent on online marketing from an August 2011 Forrester report:

  • Online marketing costs will approach $77 billion in 2016.
  • This amount will represent 26% of all advertising budgets combined.

Search is the new Yellow Pages from a Burke 2011 report:

  • 76% of respondents used search engines to find local business information vs. 24% who turned to print yellow pages.
  • 67% had used search engines in the past 30 days to find local information, and 23% responded that they had used online social networks as a local media source.


An August 2011 Pew Internet study revealed:

  • The percentage of Internet users who use search engines on a typical day has been steadily rising from about one-third of all users in 2002, to a new high of 59% of all adult Internet users.
  • With this increase, the number of those using a search engine on a typical day is pulling ever closer to the 61 percent of Internet users who use e-mail, arguably the Internet's all-time killer app, on a typical day.

StatCounter Global Stats reports the top 5 search engines sending traffic worldwide:

  • Google sends 90.62% of traffic.
  • Yahoo! sends 3.78% of traffic.
  • Bing sends 3.72% of traffic.
  • Ask Jeeves sends 0.36% of traffic.
  • Baidu sends 0.35% of traffic.

A 2011 study by Slingshot SEO reveals click-through rates for top rankings:

  • A #1 position in Google's search results receives 18.2% of all click-through traffic.
  • The second position receives 10.1%, the third 7.2%, the fourth 4.8%, and all others under 2%.
  • A #1 position in Bing's search results averages a 9.66% click-through rate.
  • The total average click-through rate for the first ten results was 52.32% for Google and 26.32% for Bing.
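The skew in those click-through figures is worth a quick arithmetic check, using only the Google percentages quoted above:

```python
# Quick arithmetic on the Slingshot SEO figures quoted above: how much of
# the click-through traffic do the top positions capture?
top4 = [18.2, 10.1, 7.2, 4.8]           # CTR percentages for positions 1-4
top4_share = round(sum(top4), 1)
print(top4_share)                        # 40.3

# The first ten results total 52.32%, so positions 5-10 share the rest.
positions_5_to_10 = round((52.32 - top4_share) / 6, 2)
print(positions_5_to_10)                 # about 2.0 percent each
```

Four positions capture roughly 40% of all clicks, while the remaining six first-page results average about 2% each, which is the financial incentive for ranking in a nutshell.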

How People Interact With Search Engines

One of the most important elements to building an online marketing strategy around SEO is empathy for your audience. Once you grasp what your target market is looking for, you can more effectively reach and keep those users.
We like to say, "Build for users, not for search engines." There are three types of search queries people generally make:
  • "Do" Transactional Queries: I want to do something, such as buy a plane ticket or listen to a song.
  • "Know" Informational Queries: I need information, such as the name of a band or the best restaurant in New York City.
  • "Go" Navigation Queries: I want to go to a particular place on the Internet, such as Facebook or the homepage of the NFL.
When visitors type a query into a search box and land on your site, will they be satisfied with what they find? This is the primary question that search engines try to answer billions of times each day. The search engines' primary responsibility is to serve relevant results to their users. So ask yourself what your target customers are looking for and make sure your site delivers it to them.
It all starts with words typed into a small box.



How people use search engines has evolved over the years, but the primary principles of conducting a search remain largely unchanged. Most search processes go something like this:
  1. Experience the need for an answer, solution, or piece of information.
  2. Formulate that need in a string of words and phrases, also known as “the query.”
  3. Enter the query into a search engine.
  4. Browse through the results for a match.
  5. Click on a result.
  6. Scan for a solution, or a link to that solution.
  7. If unsatisfied, return to the search results and browse for another link or ...
  8. Perform a new search with refinements to the query.

Sunday, May 14, 2017

An Example Test We Performed


In our test, we started with the hypothesis that a link earlier (higher up) on a page carries more weight than a link lower down on the page. We tested this by creating a nonsense domain with a home page with links to three remote pages that all have the same nonsense word appearing exactly once on the page. After the search engines crawled the pages, we found that the page with the earliest link on the home page ranked first.

Testing like this is useful, but it is not the only way search marketers learn how the engines work.

In addition to this kind of testing, search marketers can also glean competitive intelligence about how the search engines work through patent applications made by the major engines to the United States Patent Office. Perhaps the most famous among these is the system that gave rise to Google in the Stanford dormitories during the late 1990s, PageRank, documented as Patent #6285999: "Method for node ranking in a linked database." The original paper on the subject – Anatomy of a Large-Scale Hypertextual Web Search Engine – has also been the subject of considerable study. But don't worry; you don't have to go back and take remedial calculus in order to practice SEO!
Through methods like patent analysis, experiments, and live testing, search marketers as a community have come to understand many of the basic operations of search engines and the critical components of creating websites and pages that earn high rankings and significant traffic.

How Search Engines Operate


Search engines have two major functions: crawling and building an index, and providing search users with a ranked list of the websites they've determined are the most relevant.

  1. Crawling and Indexing
    Crawling and indexing the billions of documents, pages, files, news, videos, and media on the World Wide Web.
  2. Providing Answers
    Providing answers to user queries, most frequently through lists of relevant pages that they've retrieved and ranked for relevancy.

Crawling and Indexing

Imagine the World Wide Web as a network of stops in a big city subway system.

Each stop is a unique document (usually a web page, but sometimes a PDF, JPG, or other file). The search engines need a way to “crawl” the entire city and find all the stops along the way, so they use the best path available—links.

The link structure of the web serves to bind all of the pages together.

Links allow the search engines' automated robots, called "crawlers" or "spiders," to reach the many billions of interconnected documents on the web.
Once the engines find these pages, they decipher the code from them and store selected pieces in massive databases, to be recalled later when needed for a search query. To accomplish the monumental task of holding billions of pages that can be accessed in a fraction of a second, the search engine companies have constructed datacenters all over the world.
These monstrous storage facilities hold thousands of machines processing large quantities of information very quickly. When a person performs a search at any of the major engines, they demand results instantaneously; even a one- or two-second delay can cause dissatisfaction, so the engines work hard to provide answers as fast as possible.
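The crawl-and-follow-links process described here can be sketched in miniature. The pages and links below are invented, and this toy in-memory crawl only illustrates the idea, not how any real engine is built:

```python
# Toy sketch of link-following crawling over an invented in-memory "web".
from collections import deque
from html.parser import HTMLParser

PAGES = {
    "/": '<a href="/about">About</a> <a href="/blog">Blog</a>',
    "/about": '<a href="/">Home</a>',
    "/blog": '<a href="/blog/post-1">Post 1</a>',
    "/blog/post-1": '<a href="/">Home</a>',
    "/orphan": 'No page links here, so a crawler never discovers it.',
}

class LinkExtractor(HTMLParser):
    """Collect every href from <a> tags in a page."""
    def __init__(self):
        super().__init__()
        self.links = []
    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href":
                    self.links.append(value)

def crawl(start="/"):
    """Breadth-first crawl: store each page, then queue every link it exposes."""
    index, queue = {}, deque([start])
    while queue:
        url = queue.popleft()
        if url in index or url not in PAGES:
            continue
        index[url] = PAGES[url]           # "store selected pieces" of the page
        extractor = LinkExtractor()
        extractor.feed(PAGES[url])
        queue.extend(extractor.links)     # follow links to find more stops
    return index

crawled = crawl()
print(sorted(crawled))  # "/orphan" is never reached: no link points to it
```

Note that the orphan page is never indexed because no link points to it, which is exactly the "poor link structures" problem listed earlier.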
Providing Answers
Search engines are answer machines. When a person performs an online search, the search engine scours its corpus of billions of documents and does two things: first, it returns only those results that are relevant or useful to the searcher's query; second, it ranks those results according to the popularity of the websites serving the information. It is both relevance and popularity that the process of SEO is meant to influence.

How do search engines determine relevance and popularity?

To a search engine, relevance means more than finding a page with the right words. In the early days of the web, search engines didn’t go much further than this simplistic step, and search results were of limited value. Over the years, smart engineers have devised better ways to match results to searchers’ queries. Today, hundreds of factors influence relevance, and we’ll discuss the most important of these in this guide.
Search engines typically assume that the more popular a site, page, or document, the more valuable the information it contains must be. This assumption has proven fairly successful in terms of user satisfaction with search results.
Popularity and relevance aren’t determined manually. Instead, the engines employ mathematical equations (algorithms) to sort the wheat from the chaff (relevance), and then to rank the wheat in order of quality (popularity).
These algorithms often comprise hundreds of variables. In the search marketing field, we refer to them as “ranking factors.” Moz crafted a resource specifically on this subject: Search Engine Ranking Factors.
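A toy sketch of that two-step idea, with invented documents and popularity scores (real engines weigh hundreds of ranking factors, not one):

```python
# Toy illustration of "sort the wheat from the chaff (relevance), then rank
# the wheat in order of quality (popularity)". Documents and scores are
# invented for illustration.
docs = [
    {"title": "Ohio State University", "text": "universities research campus", "popularity": 0.9},
    {"title": "Harvard University",    "text": "universities history campus",  "popularity": 0.8},
    {"title": "Coffee Roasting Guide", "text": "coffee beans roasting",        "popularity": 0.95},
]

def search(query):
    terms = query.lower().split()
    # Relevance: keep only documents containing every query term.
    relevant = [d for d in docs if all(t in d["text"] for t in terms)]
    # Popularity: order the relevant set, most popular first.
    return sorted(relevant, key=lambda d: d["popularity"], reverse=True)

results = search("universities")
print([d["title"] for d in results])  # Ohio State ranks above Harvard
```

The coffee page, despite the highest popularity score, never appears for this query: popularity only orders results that already passed the relevance filter.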
[Screenshot: search engine results for the query “Universities”]
You can surmise that search engines believe that Ohio State is the most relevant and popular page for the query “Universities” while the page for Harvard is less relevant/popular.

How Do I Get Some Success Rolling In?

Or, "how search marketers succeed"

The complicated algorithms of search engines may seem impenetrable. Indeed, the engines themselves provide little insight into how to achieve better results or garner more traffic. What guidance they do offer about optimization and best practices is described below:
Google

SEO INFORMATION FROM GOOGLE WEBMASTER GUIDELINES

Google recommends the following to get better rankings in their search engine:
  • Make pages primarily for users, not for search engines. Don't deceive your users or present different content to search engines than you display to users, a practice commonly referred to as "cloaking."
  • Make a site with a clear hierarchy and text links. Every page should be reachable from at least one static text link.
  • Create a useful, information-rich site, and write pages that clearly and accurately describe your content. Make sure that your <title> elements and ALT attributes are descriptive and accurate.
  • Use keywords to create descriptive, human-friendly URLs. Provide one version of a URL to reach a document, using 301 redirects or the rel="canonical" attribute to address duplicate content.
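The last point, serving one version of each URL, usually starts with normalizing duplicate URL variants. Here is a hedged sketch using Python's standard library; the specific rules (lowercase host, drop "www.", strip a hypothetical "sessionid" parameter, drop the trailing slash) are illustrative choices, not a standard, and the right set depends on the site:

```python
# Sketch: collapsing common duplicate-URL variants to one canonical form.
# The normalization rules and the "sessionid" parameter are illustrative
# assumptions, not a standard.
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

def canonicalize(url):
    scheme, netloc, path, query, _fragment = urlsplit(url)
    netloc = netloc.lower().removeprefix("www.")    # one host spelling
    path = path.rstrip("/") or "/"                  # no trailing slash
    kept = [(k, v) for k, v in parse_qsl(query) if k != "sessionid"]
    return urlunsplit((scheme.lower(), netloc, path, urlencode(kept), ""))

variants = [
    "https://WWW.Example.com/widgets/",
    "https://example.com/widgets?sessionid=abc123",
    "https://example.com/widgets",
]
print({canonicalize(u) for u in variants})  # all three collapse to one URL
```

Once a site agrees on one canonical form, 301 redirects or rel="canonical" annotations can point every variant at it, so link equity and indexing aren't split across duplicates.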
Bing

SEO INFORMATION FROM BING WEBMASTER GUIDELINES

Bing engineers at Microsoft recommend the following to get better rankings in their search engine:
  • Ensure a clean, keyword-rich URL structure is in place.
  • Make sure content is not buried inside rich media (Adobe Flash Player, JavaScript, Ajax) and verify that rich media doesn't hide links from crawlers.
  • Create keyword-rich content and match keywords to what users are searching for. Produce fresh content regularly.
  • Don’t put the text that you want indexed inside images. For example, if you want your company name or address to be indexed, make sure it is not displayed inside a company logo.

Have No Fear, Fellow Search Marketer!

In addition to this freely-given advice, over the 15+ years that web search has existed, search marketers have found methods to extract information about how the search engines rank pages. SEOs and marketers use that data to help their sites and their clients achieve better positioning.
Surprisingly, the engines support many of these efforts, though the public visibility is frequently low. Conferences on search marketing, such as Search Marketing Expo, Pubcon, Search Engine Strategies, Distilled, and Moz’s own MozCon, attract engineers and representatives from all of the major engines. Search representatives also assist webmasters by occasionally participating online in blogs, forums, and groups.
Time for an Experiment
There is perhaps no greater tool available to webmasters researching the activities of the engines than the freedom to use the search engines themselves to perform experiments, test hypotheses, and form opinions. It is through this iterative—sometimes painstaking—process that a considerable amount of knowledge about the functions of the engines has been gleaned. Some of the experiments we’ve tried go something like this:
  1. Register a new website with nonsense keywords (e.g., ishkabibbell.com).
  2. Create multiple pages on that website, all targeting a similarly ludicrous term (e.g., yoogewgally).
  3. Make the pages as close to identical as possible, then alter one variable at a time, experimenting with placement of text, formatting, use of keywords, link structures, etc.
  4. Point links at the domain from indexed, well-crawled pages on other domains.              
  5. Record the rankings of the pages in search engines.
  6. Now make small alterations to the pages and assess their impact on search results to determine what factors might push a result up or down against its peers.
  7. Record any results that appear to be effective, and re-test them on other domains or with other terms. If several tests consistently return the same results, chances are you’ve discovered a pattern that is used by the search engines.