Google - Supplemental Index

The Supplemental Index, an 'additional' index, is in some ways comparable to the Google Sandbox. If a web page or its subpages end up in the Supplemental Index, the effects on their ranking are similar.

Pages that appear only in the Supplemental Index are visited less frequently by the crawlers, i.e. the Googlebot, than pages listed in the normal index. In addition, these pages are at a disadvantage in searches: it is often the case that pages from the Supplemental Index are only added when a particular query returns inadequate or only a few results. If a query provides enough hits, only the websites from the normal index are shown.

The reasons why a page ends up in the Supplemental Index are diverse. Here, too, it is mostly new pages that are affected, particularly those that have no or very few external links. Duplicate content, or identical titles and descriptions across many subpages, can also cause pages to be moved there. For a website that appears only in the Supplemental Index to be listed in the normal index again, it is often enough to eliminate just one of the reasons for the penalty. Once the causes are removed, it usually does not take very long until a site is listed and ranked normally again...

Google - Sandbox

Having been mentioned several times already, the filter, or 'phenomenon', known as the Google Sandbox deserves a closer look at this point.

Since the beginning of 2004, this unofficial filter, never confirmed by Google, has haunted the ranks of website operators. By now it is almost certain that this effect, which mainly hits new websites, really exists.

To prevent new websites from appearing at the top of the results for important keywords within a very short time, and to exclude spam pages from the rankings, Google seems to have developed a filter, or an algorithm, that lists a new site far down in the results for important keywords for a certain period. Because the exact algorithm is not known, it is also difficult to say which conditions a site must meet for the filter to take effect. Pages that received many strong backlinks relatively quickly, i.e. websites with unnatural-looking link building, seem to be especially affected. On the other hand, there are also examples of sites that were released from the sandbox, or even listed at the top again, thanks to trusted websites or strong links.

External linking therefore seems to play a major role within the algorithm of the Google Sandbox. Another influencing element is likely the main keywords of a website. There are indications that mainly sites optimized for important, highly competitive keywords are affected. Websites optimized for keyword combinations, or for terms hardly used by the competition, are as a rule rarely affected. The age of a website and the competitiveness of the optimized keywords are thus possible reasons why a site is hit by the sandbox effect. If the filter catches a site, it suddenly appears only for 'obscure' keyword combinations; for its main keywords it is usually listed far behind.

How long it takes for this effect to disappear, and for the affected site to be listed normally again, is also difficult to say. It is believed that a website is devalued for 3 to a maximum of 12 months; the average stay is estimated at 3 to 6 months. Experts who doubt the existence of a special filter tend to assume that the described effects are simply part of the normal Google algorithm. This is supported by the fact that the triggering conditions, the length of stay, and the criteria under which a website is released from the sandbox are not clear and vary strongly.

To be released from the sandbox, a few strong and trustworthy links from other websites are sometimes enough. It has also been shown that websites that were linked in their initial phase with less keyword-specific text links, or with buttons or banner links, were less affected. Links containing only the domain name also had no major negative impact.

To escape the Google Sandbox, link your site in the early stages with only a few links, if possible from trusted sources. Moreover, it is advisable to use banners or buttons for some of these links. In addition, you should at first optimize for keyword combinations or less contested single keywords, since web pages optimized for a few highly competitive keywords are penalized more often...

Google - Filters

Google does not list new sites in its index without a prior selection. The search algorithm of every search engine includes countless filters and criteria by which websites are evaluated and sorted. Criminal, harmful-to-minors, or otherwise objectionable content is usually filtered out, and the corresponding websites are deleted from the index. Furthermore, with personalized search, for example, results are supplemented with new or different content according to the current behavior of the searcher.

There are a few filters that seem to catch search-engine-optimized websites in particular. Below, some of these filters and their effects are briefly explained ...

Google - algorithm update

Changes within a search engine's algorithm usually have a greater impact on positions in the results. New assessment criteria are added to the algorithm, or existing factors are altered or removed.

Because of constantly emerging manipulation techniques, changes in the algorithms of other search engines, or completely newly developed search technologies, the Google algorithm is changed or adjusted at irregular intervals...

Google - BL update

The number of external links, or backlinks, of a web page is also continuously updated. Such changes are generally reflected in the evaluation of the site within a few days. At larger intervals, roughly every 2-3 months, however, the link data already taken into account before is exported into the index. The external links of a web page can then be queried with the query link:www.ihredomain.de.

This update, or export, likewise has no major impact on the ranking of a web page. There are, however, some other ranking factors and criteria in which the backlinks shown by Google are involved; in this way they flow in indirectly, especially with other search engines...

Google - PR update

As already described, the PageRank of a web page is continuously updated. Roughly every three months, however, the displayed PageRank is refreshed. It can therefore take up to three months before your site shows a PR for the first time.

The PR updates do not take into account all external links set up to that date; the cut-off date up to which existing links are counted usually lies weeks before. The actual update has no, or only a very small, effect on positions within the result lists.

For many website operators, however, the PageRank of a web page has become a kind of 'status symbol'. It is the visible result of a 'successful' search engine optimization and reflects the 'value' of a page. For successful link building, and in particular for link exchange, the displayed PR is quite important...

Google - Updates

If you optimize, modify, or extend your website, the changes are not immediately taken into account or displayed in the results. First, the Googlebot must visit your website and note the changes. Depending on the weight or external linking of your website, this alone can easily take several days. It can then take some more time before the changes flow into the evaluation and, for example, the ranking of your page.

Most changes, however, are reflected within a very short time. Google updates the information contained in its index continuously. The so-called 'Google Dance', i.e. a major ranking update taking place at longer intervals, no longer exists. PageRank and backlinks are updated regularly. If, for example, your website received some new links from other websites because of new information, your site may already rise in the search results within a few days.

The same goes for PageRank. The current ranking of your site always reflects the 'current' PageRank of the page. If your website shows a PR of 3 but has a calculated 'future' PageRank of 5, the latter flows into the assessment of your site. The displayed PageRank is refreshed about every three months in a so-called PR update.

Besides these, there are some further updates after which changes in the search result lists can occur ...

Google - alternatives

The PageRank algorithm, i.e. the number and strength of the incoming links of a web page, will always remain an important factor in assessing the relevance or importance of a site, since links are among the defining characteristics of the World Wide Web. PageRank, however, gives no direct indication of the actual quality of a web page. Because that quality is ultimately decisive for relevance, alternative assessment procedures are being worked on, some of which are briefly explained below ...

The Hilltop algorithm: With the help of the Hilltop algorithm, websites are sorted according to their relevance to specific search words. First, the so-called expert pages on a particular topic or search word are determined. These are pages that link to many independent websites on a particular topic. Then the so-called authority pages are determined: authority pages are pages that are linked from at least two expert pages. The Hilltop algorithm was developed in Toronto; in 2003, Google bought the patent on the algorithm. Authority pages as defined by Google can be recognized, among other things, by the fact that their entries appear in first place in the result lists for relevant keywords and that, besides the usual page description, further information is shown, e.g. links to the most important subpages.

Hubs and Authorities: This procedure, also known under the name 'HITS', defines the value of a page by its hub status and its authority. The hub property rates the number and quality of the outgoing links of a page, while the authority of a web page is determined by its incoming external links. This algorithm, developed among others by Jon Kleinberg, generally identifies important junctions within the link structure of the Internet.
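The mutual reinforcement of hubs and authorities described above can be sketched in a few lines. The following is a minimal, illustrative implementation; the tiny example graph and its page names are invented, and real HITS systems run on a query-specific subgraph rather than a whole web.

```python
# Minimal HITS sketch: each page gets a hub score (quality of its outgoing
# links) and an authority score (quality of its incoming links), refined
# iteratively. The graph maps each page to the pages it links to.
def hits(graph, iterations=50):
    pages = list(graph)
    hub = {p: 1.0 for p in pages}
    auth = {p: 1.0 for p in pages}
    for _ in range(iterations):
        # Authority: sum of the hub scores of the pages linking in.
        auth = {p: sum(hub[q] for q in pages if p in graph[q]) for p in pages}
        # Hub: sum of the authority scores of the pages linked to.
        hub = {p: sum(auth[q] for q in graph[p]) for p in pages}
        # Normalize so the scores stay bounded.
        na = sum(v * v for v in auth.values()) ** 0.5 or 1.0
        nh = sum(v * v for v in hub.values()) ** 0.5 or 1.0
        auth = {p: v / na for p, v in auth.items()}
        hub = {p: v / nh for p, v in hub.items()}
    return hub, auth

# Invented three-page web: "a" and "b" both point to "c".
web = {"a": ["c"], "b": ["c"], "c": []}
hub, auth = hits(web)
# "c" is linked by both hubs, so it gets the highest authority score.
```

Page "c", linked by both other pages, ends up as the authority; "a" and "b", which link to it, end up as the hubs.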

Trust Rank: The TrustRank algorithm, also patented by Google, was primarily developed to detect spam pages. First, a few authority or expert pages are selected: independent, trusted sites, for example Wikipedia or DMOZ. Then all outgoing links are followed, and the trust of these seed pages is transmitted, similar to the distribution of PageRank, to the linked pages and on through their external links. Because spam pages are, as a rule, hardly linked from trusted pages, they can be filtered out relatively easily and devalued accordingly.
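The propagation of trust from a small seed set can be sketched as follows. This is a simplified, illustrative model, not Google's actual implementation; the graph, the seed, and the damping value are all assumptions.

```python
# Simplified TrustRank-style propagation: trust starts at hand-picked seed
# pages and flows along outgoing links with a damping factor, so pages far
# from any trusted seed end up with little trust.
def trustrank(graph, seeds, damping=0.85, iterations=50):
    pages = list(graph)
    seed_score = {p: (1.0 / len(seeds) if p in seeds else 0.0) for p in pages}
    trust = dict(seed_score)
    for _ in range(iterations):
        new = {}
        for p in pages:
            # Each linking page passes on its trust, split over its out-links.
            incoming = sum(trust[q] / len(graph[q])
                           for q in pages if graph[q] and p in graph[q])
            new[p] = (1 - damping) * seed_score[p] + damping * incoming
        trust = new
    return trust

# Invented chain: a trusted seed links to a normal page, which links to spam.
web = {"seed": ["good"], "good": ["spam"], "spam": []}
t = trustrank(web, seeds=["seed"])
# Trust decays with distance from the seed: seed > good > spam.
```

The page two hops from the seed retains the least trust, which is exactly the property used to flag likely spam.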

In addition, there are many other approaches for improving search engine results or filtering out irrelevant ones. In general, all of these procedures put the content, the actual quality of a web page, in the foreground. Since assessing that quality in the face of ever new manipulation techniques is very difficult to fully automate, many approaches rely, similar to TrustRank, on manually audited websites as fixed points.

Editorially run websites, directories, and assessment systems such as social bookmarking will become more and more important in the future and will supplement the evaluation procedures of the search engines ...

Google - PageRank advantages and disadvantages

With PageRank, Larry Page and Sergey Brin introduced an external factor into the process of evaluating websites, i.e. one that at first glance is difficult to influence or manipulate. The basic idea, that a website is rated higher if it has many, and also strong, links from other sites, sounds plausible and was one of the main reasons for the subsequent success of the Google search engine.

After the introduction of PageRank, Google's results were usually better than those of the other search engines. Websites without relevant content, which were listed high by the other services thanks to keyword spam, for example, were filtered out by Google, because those pages had relatively few external links. The PageRank algorithm was therefore long regarded as a 'milestone' in the development of a search engine that delivers really relevant results and is difficult to manipulate.

Unfortunately, this 'milestone' has since become more of a millstone. Because PageRank was from the beginning an important, if not the most important, ranking factor, and most other search engines adopted the algorithm in a similar manner, links suddenly had a high economic value. They were sold, rented, and exchanged. Suddenly there were countless vendors and services specializing exclusively in the marketing of links, and many website operators tried to collect as many links as possible. Guest books and forums that had never been visited before were suddenly flooded with link entries.

It quickly became clear that PageRank is one of the easiest factors to manipulate. Whoever had the opportunity, time, and above all money to invest could severely affect the ranking of their website. Suddenly, money decided more and more the order in the results of most search engines. Unfortunately, this has changed only slightly until now: the link trade is still one of the fastest-growing markets on the World Wide Web, although Google and the other search engines try again and again to act against it.

One such measure is the attribute rel="nofollow". Links carrying this attribute are not evaluated by the crawlers of the search engines, i.e. the PR of the linking page is not passed on to the linked page. Above all in forums and blogs, the attribute is now deliberately used to discourage spam entries. Known link-selling providers are also regularly penalized, for example by having their websites removed from the index, or by having all links contained on the website devalued.

Unfortunately, all these measures are often only a 'drop in the ocean'. Although the importance of PageRank for the ranking of a website has greatly decreased, the link trade continues to flourish ...

Google - PageRank algorithm

Since the precise mathematical representation of the algorithm would take up too much space here, only the principle of the formula is briefly explained. According to the PageRank algorithm, each page has a weight, the so-called PageRank. It is displayed on a scale of 0-10, with 10 being the highest value and the highest weighting.

The weight, i.e. the PageRank of a page, is higher the more pages, each with a high PageRank of their own, link to it. Through a link from another site, a web page receives a fraction of the weight of that page, i.e. a part of its PR. How big this fraction exactly is, is determined by the formula. To illustrate this, the following simple examples may serve:

For your website to receive a PR of 3, theoretically 101 pages with a PR of 2, or about 3 pages with a PR of 4, must link to your page. If you want to achieve a PageRank of 5, you need 3,055 PR 2 links or 101 PR 4 links. For a PageRank of 3, a single PR 5 link is also sufficient, and for a PR of 5, a single link from a website with a PageRank of 7 is already enough.

The higher the PageRank of the linking site, the higher the PR of the linked page. The rise of PageRank is not uniform, but rather 'erratic': to climb one step on the PR scale, simplified, ever more and ever stronger links are necessary.
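The recursive weighting behind these examples can be sketched in a few lines of code. The following is only an illustration of the principle: the three-page graph is invented, the damping value 0.85 is the one commonly cited in the literature, and the raw scores shown here are probabilities, not the logarithmic 0-10 toolbar scale.

```python
# Minimal PageRank power-iteration sketch. Each page passes a share of its
# own score along every outgoing link; the damping factor d models a surfer
# who occasionally jumps to a random page.
def pagerank(graph, d=0.85, iterations=50):
    pages = list(graph)
    n = len(pages)
    pr = {p: 1.0 / n for p in pages}
    for _ in range(iterations):
        pr = {p: (1 - d) / n + d * sum(pr[q] / len(graph[q])
                                       for q in pages if p in graph[q])
              for p in pages}
    return pr

# Invented example: "b" is linked by both "a" and "c".
web = {"a": ["b"], "b": ["c"], "c": ["a", "b"]}
ranks = pagerank(web)
# "b" collects weight from two pages and therefore ends up with the top score.
```

Note how "c", linking to two pages, splits its weight between them: this is the 'fraction of the weight' passed on by each link.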

The PageRank of a web page can be determined with the help of the Google Toolbar. It shows the PR of whichever page the user is currently viewing.

The displayed PageRank is refreshed about every three months in a so-called PR update. It may therefore happen that your website shows no PR even though many other pages link to it. However, this is only superficially the case: the external links and their effect flow into the ranking of your website after a very short time. The PR update is merely an update of the 'visible' PageRank ...

Google - PageRank

PageRank, or the algorithm based on it, was developed by the founders of the Google search engine. Larry Page and Sergey Brin, then still students at Stanford University, were looking for a new method for assessing the relevance of individual websites.

The PageRank algorithm rates websites according to the number as well as the quality and strength of their incoming links.

The basis of this process is the theory that the relevance of a page increases when other sites link to it. The stronger the weight of the respective links, and the larger their number, the higher the relevance, or the PageRank, of the linked page.

Put simply, PageRank defines the probability that a page will be found on the Internet ...

Google - Reinclusion Request

If your web page has been removed from the Google index, you have the option of filing an application for re-inclusion, a so-called Reinclusion Request. Before filing it, however, you should find and eliminate the reason for the removal of your site. Often there are even several reasons that have led to the penalty. Because your website may be examined manually, you have to be really sure that, at the time you file the application, it complies with the guidelines in every point.

The request itself should in any case be submitted via the support forum; the exact address, or the exact link, can be found in the section 'Important addresses / links'. It makes no sense to send the request to a non-dedicated e-mail address, because Google receives thousands of e-mails daily on a wide variety of issues and concerns.

It is important that you put 'Reinclusion Request' in the subject line. Only then can Google assign and process your mail properly.

In the text of the message you need to briefly explain that, and above all why, your site was removed from the index. Explain then that the possible causes have been eliminated and that your site now complies with the guidelines again. Try to be friendly and indicate that you have learned from this mistake and will not attempt to circumvent the guidelines again. It is best if you stay honest: writing that you are not to blame for the cause of the banishment, or inventing stories, is not advisable.

In general, it takes quite a while for your application to be processed. Be patient, even if it is difficult; repeated submissions or follow-up questions should be avoided. Until your site may appear in the index again, weeks or even months can easily pass, and the chances that your page will ever appear again remain low.

There are no other possibilities to get your page back into the index. You need not try to publish your site under a different domain, on another server, or with a different IP address: Google will quickly find out that it is the same site and will not include it in the index ...

Google - banishment from the index

It can also happen that a web page is 'banned' from the Google index entirely. In short, this is the worst thing that can happen to you as the operator of a site. The vast majority of websites receive most of their visitors via the search engine Google; if your site no longer shows up in the search results, the number of visitors decreases significantly.

However, Google and the other search engines do not simply remove a page from their index without reason. Google, for example, has certain guidelines which you should observe when creating and optimizing a site. If the structure or content of your website violates these guidelines, it may be deleted from the index without prior notice.

Besides content that does not conform to the guidelines, 'hidden text', keyword spam, duplicate content, and other unauthorized optimizations or manipulations are frequent reasons for banishment from the index.

Because of the sheer number of websites, it is not possible to determine individually whether, or to what extent, each website violates the guidelines. Instead, your page passes through specific programs or filters during indexing and crawling. If a gross violation is found, your page is removed from the index. Another consequence of a breach of the guidelines is the so-called 'Slow Death': your page does not disappear abruptly, but slowly, page by page, from the index. This type of sanction is relatively common and has been observed, for example, on websites that can be found several times on the Internet, e.g. under multiple domains, i.e. as duplicated content. Although you can, e.g. with the help of Google Analytics, try to determine why your website was removed, the chance that it will return to the Google index is rather low.

You should in any case adhere to the guidelines of the search engines. Not just because this lets you avoid a possible removal from the index, but also because any breach or attempted manipulation can have a significant impact on the ranking of your site. Once banned, your only option is to eliminate the reason for the removal and file a Reinclusion Request, a request for re-inclusion ...

Google - inclusion in the index

As described in the chapter 'Indexierbarkeit', there are two ways to have your page indexed, i.e. included in the index of the Google search engine. In the section 'Important Addresses / Links' you will find the Internet address at which you can register your new website. Once registered, the crawler, in this case the Googlebot, will sooner or later visit your website and include it in the index.

In general, it takes a few days before your site appears in the search results for the first time. Whether your website has been indexed can be checked with a so-called 'site query': simply enter site:www.ihredomain.de into the search box of the search engine. You will then receive a list of the pages of your Internet presence already present in the index.

Another way to get your site indexed is to set up some external links. Since the Googlebot follows every link, it will come across your website relatively quickly this way. This alternative has the advantage that your site is indexed faster, sometimes within a few hours. In addition, Google then rates the relevance and authority of your site slightly higher because of the external links. The process of getting all your web pages into the search engine's index can be accelerated in this way as well.

Generally, your home page is indexed first; the subpages follow only later. It can also happen that a site query shows pages one day and that, at a later query, seemingly no page at all exists in the index. The latter is not actually the case: Google operates more than one data center to which queries are forwarded, and depending on which data center you happen to query, your site may not be listed there yet. It takes some time before the pages are shown at all data centers.

The more external links point to a web page, the faster and more extensively it will be indexed. Once in the index, your page will be visited periodically and searched for changes or new content. If your site changes, e.g. new page titles or descriptions are inserted, it may nevertheless take a long time before this shows up in the search results.

You should therefore first finish working on your site, and only register it once you are sure that you really do not have to carry out any more fundamental changes ...

Offpage Optimization - Page Views

As explained in the chapter 'visitor behavior', the number of visitors, or page views, is important for the ranking of a web page.

In principle, this also means that the position of a web page in the results of one search engine is important for its listing in other search engines. If your page is very well placed at Yahoo, for example, it will in most cases sooner or later also rank well in the other search engines. The algorithms of many search engines are very similar, and some even exchange data among themselves. Thus a website that receives many visitors via a particular search engine will also rank well in other search engines. Furthermore, if the subpages of a website receive many visitors, this shows the search engines that the page offers interesting and relevant information and is also very well linked...

Offpage optimization - Continuity

Continuity, i.e. the gradual maintenance and development of a site and of its link building, is another ranking factor.

A website that is regularly updated, upgraded, and expanded receives more attention from the search engines than a web page that, once published, barely changes. The same is true for link building: if a website regularly receives new backlinks, this indicates that the site is interesting and regularly offers new information. If, on the other hand, a web page gets many backlinks right at the beginning but later only a few or even none, it can be concluded that the links were set up 'artificially'.

Again, it is clear that building and optimizing a website takes a lot of time, work, and patience...

Offpage optimization - Visitor Behavior

It is often debated whether visitor behavior is important for the ranking of a web page. Meanwhile, however, there should be no question that both the number of visitors and their duration of stay, or 'surfing habits', flow to a considerable extent into the assessment of a page by some search engines.

Through the evaluation of so-called log files, it can be accurately determined how many visitors a page had, how long each visitor stayed on the site, which link they came from, and where they exited the site. With the help of the many toolbars, too, it is possible for the search engines to obtain such information. In particular, the duration of each visit allows conclusions about the quality of the page. If almost everyone leaves the site after a short time, the search engines assume that the page offers no interesting information. If the visitors linger long on the site and view many individual pages, however, this shows that the page has interesting and relevant information to offer.
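The kind of log-file evaluation described above can be sketched with the standard library alone. The log line below follows the common 'combined' web server log format; the sample entry itself (IP address, date, query) is invented for illustration.

```python
# Parse one web server log line ("combined" format) into its fields.
# The referrer field reveals which link, and often which search query,
# brought the visitor to the page.
import re

LINE = re.compile(
    r'(?P<ip>\S+) \S+ \S+ \[(?P<time>[^\]]+)\] '
    r'"(?P<request>[^"]*)" (?P<status>\d{3}) \S+ '
    r'"(?P<referrer>[^"]*)" "(?P<agent>[^"]*)"'
)

def parse(line):
    m = LINE.match(line)
    return m.groupdict() if m else None

sample = ('203.0.113.7 - - [10/Oct/2006:13:55:36 +0200] '
          '"GET /index.html HTTP/1.0" 200 2326 '
          '"http://www.google.com/search?q=seo" "Mozilla/4.08"')

hit = parse(sample)
```

Aggregating such records per IP address and time window is what yields the visit counts and durations of stay the text refers to.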

Since it is also possible to determine through which keyword each visitor reached the site, the relevance of the site in relation to each keyword can be assessed as well. If searchers do not find relevant information for their search word, they will soon leave the site. It is quite possible that some search engines include this in the ranking of a web page for the various search words.

Always try to offer the searchers who found your page via a particular keyword the information they expect. In addition, you should make sure that the home page of your Internet presence is built clearly and invitingly. If visitors feel immediately addressed, they will stay longer on the site and eventually come back ...

Offpage Optimization - Alexa rank

Alexa Internet is a subsidiary of amazon.com. With the help of various programs, e.g. the Alexa toolbar preloaded in Windows XP, the user can access various services similar to those of a search engine. The toolbar shows, among other things, the so-called Alexa rank of a web page.

It assesses the quality of a site based on its traffic, i.e. its visitors and their behavior. The program is more than controversial, since the toolbar also records the user's own surfing habits and transmits them to Alexa Internet. Nevertheless, or perhaps because of it, the Alexa rank plays a role in the ranking of your website: the higher your website rises in this ranking, the better it is rated by the search engines...

Offpage Optimization - Ranking Other factors

Just as with the internal factors, there are among the external ranking factors many more criteria that are taken into account for the listing in the search engine results. Generally, at the moment there are still more internal factors, but the trend shows that in the future external, i.e. less influenceable or manipulable, ranking criteria will play an increasingly important role. At this point, therefore, some more important external factors will be briefly discussed ...

Offpage optimization - Link Exchange

It is often very difficult to win link exchange partners for a site whose PageRank is still at 0, 1, or 2. Your page should have at least a PR of 3 or 4 before you begin sending link exchange requests.

Although it is by no means the only criterion in the assessment of links, many website operators look first at the PageRank of the potential link exchange partner's page. If your website has no PR, or a very low one, requests are usually denied. A PR of 3 can be reached relatively easily, as already described, for example through some links from web catalogs and directories. Once your own page has a PageRank of 3 or even 4, you should take enough time to find appropriate link partners. Widespread are so-called link exchanges, or special forums in which one can specifically look for suitable partners. As shown in the previous chapter, however, the search engines apparently devalue purchased, rented, and swapped links under certain circumstances. You should therefore refrain from services specializing purely in link exchange, since these providers are already known to most search engines.

Two simple ways to find appropriate link partners are searching for pages relevant to your keywords in some search engines and directories, and performing a backlink analysis of your competitors' pages. If you enter your main keywords in several search engines, you will receive a list of potential link partners. If you perform a backlink analysis of some competitors' sites, you can see which websites already link to your competitors; these too are, mostly because of their topical relevance, potential partners for your own website.

For the actual selection and assessment of the websites, or of the links, you should again use the criteria listed in the chapter 'link buying / Link Renting' for assistance. PageRank and the issues of authority and topical relevance of a website shall be mentioned here only in passing.

Once you have finally decided on some websites with which you could imagine a link partnership, it is time to contact the operators of the respective sites. If you write an e-mail, you should try to address the site operator personally and explain briefly but precisely why you chose his site and why you could very well imagine a link partnership with him. Impersonal, standardized e-mails, or those sent by special programs, usually meet a high rejection rate. Try to be friendly but also competent: go briefly into the topic or the site of the potential partner and explain what you particularly like about it. Details such as the PageRank of the page or the number of its backlinks you should avoid; either the respective website operator knows the quality of his site exactly anyway, or he knows nothing of these terms.

Once you and your future exchange partner have come to an agreement, you must first discuss some important points. Point out that you will check the links to your site regularly: there are unfortunately always website operators who after some time remove, alter, or move links. Explain also that the position of a link is an important criterion in its assessment; avoid footer links or references placed on dedicated link or barter pages. Another point of discussion is the nature of the link: it must always point directly, i.e. without diversion, to your page. Since the search engines, as mentioned, may devalue swapped links, you should if possible avoid a mutual, i.e. reciprocal, link.

Simple reciprocal link structures are usually detected immediately and possibly evaluated negatively by some search engines. Unfortunately, this kind of linking can only be bypassed if you have several pages: then you can place the link to your partner's page on one of your sites and have him link to another of your sites in return. Experts still argue about whether a reciprocal link really hurts, or merely has a less positive effect on the ranking of the respective pages than a 'unilateral' link. The fact is, in any case, that reciprocal links have a major influence on the position in the search engine results, and that many swapped links, built up as described, cannot be identified as such.

Targeted link building through the exchange of equivalent, topically relevant links is therefore an appropriate way to considerably improve the ranking of your website ...

Offpage Optimization - link buying and renting link

Many inveterate search engine optimizers reject the buying or renting of external links, arguing that it is possible to position a web page at the top of the search engines' result lists without financial expense. Undoubtedly this is the case, but it is also a fact that many, if not almost all, commercial websites allocate part of their advertising budget to renting and buying links.

Of course you can also reach your goal through link barter, for example, but sometimes it may be useful to buy some strong links, e.g. to keep the number of your own outgoing links low or to avoid having to link back to other sites. Links, whether as banner or text links, can be bought almost anywhere. Some examples are Internet auction houses, special link purchase or link rental platforms, or so-called link brokers, who specialize in mediating between interested buyers and sellers.

If you have decided to buy one or more links, you should observe some basic rules and recommendations. First, note that links have their price. Cheap offers in which strong links are offered to you for a few euros should usually be treated with caution. A PR 6 link can cost, for example, 50 euros per month, a PR 7 link several hundred euros, and even stronger links correspondingly more. You must not assume, however, that a stronger link, i.e. one from a page with a higher PageRank, will automatically position your site better in the search results. Often the opposite is the case. If, for example, you buy several very strong links for a new website, the 'unnatural' link growth may cause your website to land in the Google Sandbox for a long time. What that means for the ranking of your website has already been explained.

When buying or renting links, always make sure that you do not fall within the scope of one of the search engines' many filters. Put simply, this means you have to take your time. For a relatively new site, for example, you should only buy a few PR 3, PR 4 or at most PR 5 links. Anything more could do the ranking of your site more harm than good. Once your page itself has gained some PageRank after a few months, correspondingly stronger links can be set.

If you have finally decided on a vendor, or on a web page on which your link is to be placed, you must first examine whether the PageRank of that page is genuine. With certain tricks, e.g. PageRank mirroring, it is possible to make pages that would normally have a PR of 0 display a PR of 5 or 6. You should also make sure that the PageRank and the link power of the page you have chosen are actually passed on. There are some free tools with which you can check the authenticity of the displayed PageRank and of the links themselves; some are listed in the chapter 'Links and Software Tools'. Always run a so-called backlink analysis to determine the number and quality of the relevant links pointing to the page in question. Only then can you be reasonably sure that the site will still hold the displayed PageRank in a few months. Crucial here is above all the domain popularity, which is easy to determine; the tools necessary for this can also be found in the chapter 'Links and Software Tools'. Moreover, the authority, the number of outgoing links and the topic of the website should play a role in your decision. A PR 6 link on an off-topic page that already carries 50 other links can quickly be worth less than, for example, a PR 4 link on a topic-related page with only 10 external links.

An indirect way of buying links is sponsorship or donations. A one-off donation, e.g. for the development of open-source software or as support for organizations or associations, can often earn strong backlinks. Unfortunately, such websites are now well known, so your link may quickly end up as one among many. Here too, you should first research carefully whether the 'investment' is really worthwhile.

Generally, it can be said that there is a great deal to watch out for when buying permanent or temporary links. To slow down the fast-growing market of link sellers, some search engines de-index such websites or devalue all the links on those pages. For this reason in particular, the non-commercial exchange of links is often preferable to buying links, especially where notorious or dubious sellers are concerned ...

Offpage optimization - PR services

What was said about link lists also applies to most PR or PageRank services. Since the calculation method of the PageRank was cracked a few years ago, and thus became publicly known, a variety of such service providers have appeared.

By embedding a button, essentially a snippet of HTML code, you can show visitors the current PageRank of your website as a graphic. In many cases your website is at the same time registered for visitor or ranking contests and compared with other sites.

In return for inserting the button or code, which usually also means an additional, mostly strong link back to the operator of the PR service, you often receive only a weak link on some arbitrary page. Google has only recently officially declared 'war' on the operators of these services and has partially removed the websites involved from the index.

You should therefore be more than careful and not be taken in by the advertising tricks of the majority of PR services. The damage that participating in, e.g., several of these services can inflict on your website can be far greater than the potential benefit. If you still want to offer your visitors the service described, limit yourself to one of the established providers ...

Offpage optimization - Link Lists

Another gladly used possibility for improving the ranking of a website quickly and easily are entries in link lists or guest books. There are many providers who advertise entering your website into countless link lists. The resulting 'success' mostly consists in the fact that, through the publication of your contact data, your mailbox is inundated with advertising messages. Link lists are often just random accumulations of links from a wide variety of areas. An entry in these lists will hardly influence the ranking of your site positively; it can even happen that an entry harms you and the ranking of your website deteriorates. Google itself points out explicitly in its guidelines that you should stay away from services of this kind, as the ranking of a web page can be adversely affected by links from web spammers or link programmes. It is therefore more advisable to submit your site to a few selected web catalogues and to publish some texts in article directories. If you do not want to hurt the 'reputation' of your website, you should refrain from entries in simple link lists and countless guest books. Even the 'environment' of your links is a ranking factor: links from such lists can easily be identified as a 'bad neighborhood' and devalued accordingly ...

Offpage Optimization - Press Services

Press services or distributors are primarily meant to publish important messages or news and to report them to relevant addresses, such as publishing houses, editors or journalists. As a rule, in a press release, e.g. on the launch of your site, you may include a link or a contact address under which the relevant page or related information can be found. Under some circumstances this will earn you external links to your site. There is a whole range of providers who advertise sending a message to the most important e-mail addresses or agencies. A distinction must be made between free and fee-based services.

One cannot say in general that the latter are always better. But the fact is that the free services in particular often barely select and check the countless new messages they send out daily. The fee-based providers often check the messages for their content and assign them to specific topics. Frequently, several messages on a single topic are then sent specifically to, e.g., journalists or editors of newspapers or magazines covering that field. Before distributing a press release through a particular provider, you should consider what you want to achieve.

If you only want to generate a few backlinks, some of the established free press distributors may suffice. However, if you want to awaken the interest of one or another editor, you should rather rely on a paid provider ...

Offpage optimization - Social Bookmarking

The term social bookmarking refers to the management of public favorites, i.e. favorite websites. Meanwhile there are numerous social bookmark services; among the best known are Mister Wong and del.icio.us. On these pages users can manage their own bookmarks online and view, or even subscribe to, those of other members. In addition, each member can organize his own list into different categories and weight the importance of individual bookmarks. For each category there is a 'ranking' voted on by many members, similar to an editorially maintained web catalogue.

Since the social bookmarks are constantly re-evaluated by countless members, the 'ranking' within these lists indirectly plays a role in the ranking of your website. The search engines use some data or information from the social bookmark services and let it flow partly into their own rating schemes for individual websites. To what extent this happens is difficult to say, since on the other hand it is possible to manipulate the lists of the social bookmark services, e.g. by groups of members within individual services who specifically try to 'push' individual bookmarks.

In summary, it can be said that social bookmarking, an essential part of Web 2.0, will gain more and more importance in the future. Anyone who wants to optimize his site fully for search engines will have to deal with this issue ...

Offpage Optimization - weblogs

A weblog, often simply 'blog', is in the strict sense a kind of 'digital diary'. Meanwhile, however, there are blogs on the most varied topics. At first glance, the difference from a normal web page is often hardly noticeable.

The benefits of a blog are the simple way it can be created and its easy-to-use interface. These make it easy for the owners of weblogs, and for their visitors, to add new information, entries and comments. The first weblogs appeared in the mid-90s; since then the number of blogs has soared. Thus they have also become a topic within search engine optimization. One advantage of weblogs are the so-called permalinks: as a rule, all entries made in a blog are archived and can be called up and linked at any time under a fixed address. Also interesting for many website operators is that entries in weblogs can be commented on. Because of this functionality, and especially because of their timeliness, blogs are classified very highly by the search engines.

Through the many linking options, many weblogs attain a high PR and a high degree of authority. External links from blogs are very popular for this reason and help to influence the ranking of a website positively. What is of use to the visitor and accounts for the timeliness of blogs has, on the other hand, unfortunately led many operators to close their blogs because of spam entries.

If you intend to obtain external links from weblogs, you should really take the trouble to make a meaningful contribution or comment ...

Offpage optimization - Article Directories

Another free option for obtaining external links are article directories. As the name implies, these consist of articles and texts on the most varied topics and areas. At most directories you can register and then write articles yourself. In return for the 'delivered' content, you may then often place one or two links within the article. The advantage of these links is that they sit directly in the text, i.e. in the content. The disadvantage of some directories is that many new articles are written and posted daily, so your own text often slips down relatively quickly.

Meanwhile there are numerous directories. As with web catalogues, you should make sure that your article is not published on some page with a PageRank of 0 or 1. If you research a little, you will find article directories with a PR of 4, 5 or 6. Since article directories as a rule refuse texts that have already been published elsewhere, it is advisable to take a little time when choosing. Writing an individual text is not easy and should be 'rewarded'. As a little help, the chapter 'Links and Software Tools' contains some links to lists of selected article directories.

Since, as already described, the content of a page is one of the most important ranking factors, you can, for example, offer the operators of topic-related pages to write a text or a guest article for them. Many will certainly also be willing to place a link to your page in return ...

Offpage Optimization - web catalogues

Especially in the beginning, it is often difficult to find suitable link partners for a website: the page is still unknown and has a PageRank of 0. Entry in various web catalogues is a good way to generate some backlinks quickly, easily and usually free of charge. There are web directories with and without a backlink 'obligation'; web catalogues with a backlink obligation accept your entry only if you place a counter-link on your site.

Two other criteria in the selection of appropriate directories are the type of entry and the PageRank of the page on which your entry is published. Many catalogues link to the entries only indirectly; some also use so-called 'NoFollow links', i.e. links that are not followed by the search engines and therefore have no relevance for the ranking of the linked web pages. You should also not let yourself be deceived by the PageRank of a web catalogue: often, e.g., the home page has a PR of 5 or 6, while the actual pages on which the entries are published only have a PageRank of 2 or 3. It is advisable to register a new website in about 15-25 selected web catalogues and directories. Links to some lists of various web directories can be found in the chapter 'Links and Software Tools'.
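The difference between a normal entry link and a 'NoFollow' link is a single attribute in the HTML code. A minimal illustration (the URL is a placeholder):

```html
<!-- normal link: followed by the crawlers, passes on link value -->
<a href="http://www.example.com/">Example site</a>

<!-- "NoFollow" link: marked as not to be followed, passes on no ranking value -->
<a href="http://www.example.com/" rel="nofollow">Example site</a>
```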

There are two directories that deserve your special attention. The first is the Open Directory Project (ODP), also known as DMOZ, and the second is the Yahoo directory. The authority level, and hence the ranking, of a web page that has been entered into the ODP generally increases within a very short time. The entries in the Open Directory Project are, as described earlier, entered manually by special editors. Google and other search engines assume that a site listed at DMOZ offers a general benefit to its visitors; accordingly, this entry is also reflected in the ranking of the site. You should therefore in any case try to register your page with the ODP. In general, it takes several months before your listing request is processed.

In contrast to the free entry at DMOZ, an entry in the Yahoo directory is currently only available for a fee. It is assumed that this entry does not have as positive an impact on the ranking of a web page as a listing in the ODP. Nevertheless, with an entry in the Yahoo directory you receive some strong backlinks which influence the ranking positively ...

Offpage optimization - Link Building

Correct link building is one of the most important factors in search engine optimization. Ever since the existence of the Google Sandbox, a filter within the algorithm which transports new websites that get too many links in a short period of time into the 'sand box', i.e. to the end of the relevant search results pages, a natural-looking course of link building should always be pursued. There are a few points that you should observe in any case.

Particularly with a new site, not too many or too strong external links should be set initially. A normal website that, e.g., has 50 strong backlinks after a few days, and therefore theoretically should be listed high, is penalized by the search engines, especially Google.

In the initial phase, only a few links, i.e. 2-5 per day, should be set, so that the total number of links does not increase abruptly but steadily. Hardly any website operator factors a possible stay in the Google Sandbox into his optimization work, and instead sets as many links as possible from the outset. It then usually takes 3-6 months until the filter no longer applies and the site is ranked and listed normally again.

Which kind of link building you choose is up to you, but a single stay in the sandbox may well have a lasting influence on the ranking of a web page. With an unnatural link structure it is often the case that a high percentage of the external links comes from one area of the linking websites, for example the footer. Because this area is often used for purchased or swapped links, such links are generally not weighted as strongly by the search engines. The same is true for many links from the same domain name or IP address.

Since footer links are usually displayed on all pages of an internet presence, you apparently receive very many links at once; their weight or strength, however, is often very low. Always try to achieve a natural distribution across the linking sites as well as across the link positions on those pages. The latter also applies to the link texts. If a high percentage of the links carries the same text, the search engines assume that you, as the website operator, are the initiator of these links. In natural link building the link texts are always similar, because of the topic of the web page, but never identical. Therefore offer the operators of other web pages a few different text alternatives to choose from. If you provide link text suggestions directly on your website, this means that the suggested texts need to change at regular intervals.
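How uniform the link texts of a page's backlinks are can be estimated with a few lines of code. The following Python sketch is only an illustration; the function name and the sample anchor texts are invented:

```python
from collections import Counter

def anchor_text_profile(anchor_texts):
    """Return each anchor text's share of all backlinks.

    A single text dominating the profile (e.g. well over half of all
    links) looks like an organized campaign rather than natural linking.
    """
    counts = Counter(anchor_texts)
    total = len(anchor_texts)
    return {text: count / total for text, count in counts.items()}

# Hypothetical backlink data: one anchor text clearly dominates.
anchors = ["cheap flights"] * 8 + ["flight comparison", "www.example.com"]
profile = anchor_text_profile(anchors)
dominant_share = max(profile.values())
print(f"most frequent anchor text covers {dominant_share:.0%} of all links")
```

A share of 80 % for a single phrase, as in this invented example, is exactly the kind of pattern the search engines can read as a footprint of artificial link building.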

The destination addresses of the external links should also vary to ensure a natural link structure. Many natural links point, e.g., only to a specific sub-page of a website, because only the information contained there could be interesting for the visitors of the linking site.

When starting an organized course of link building, you should at first have a few links set to the home page of your internet presence, but occasionally also some to sub-pages. Since the power or PageRank of your website is, given well-structured internal linking, evenly distributed, external links to the sub-pages also have an indirect effect on the ranking of your home page and of the entire site. The ratio of incoming to outgoing links is likewise an important ranking criterion. Search engines recognize certain patterns within the link structure: if, e.g., in a barter you place a link to the website of your partner and he in turn places a link to your website, the search engines recognize this connection as a reciprocal link. They assume that links may have been exchanged and may evaluate both links accordingly.

Although some experts contend that a mutual or reciprocal link causes no problem, some search services do pay attention to whether links are swapped or purchased rather than genuine. The PageRank, one of the most important ranking factors of the search engine Google, can easily be manipulated by trading and buying links. For this reason, increasing efforts are being made to identify and devalue such links. In summary, it can be said that a natural-looking course of link building is never clearly structured and linear. It happens, e.g., that a website receives many backlinks in a very short time because of a new and genuinely interesting article; on the other hand, a page may not get a single new link for a long time. The same applies to the link texts and the target addresses.

If you wish to 'simulate' natural link building, there are a few basic rules: above all, you should not proceed in a rigidly structured way, because search engines recognize such structures. Building links becomes significantly easier if you operate multiple websites and can vary the links and addresses ...

Offpage optimization - Topic Relevance

Links are increasingly being rented and purchased. This makes the topic relevance of the linked web pages all the more important. The search engines assume that topic-relevant links have not been purchased commercially, but were added to give the visitors further information. Therefore, wherever possible, try to get links from pages whose topic matches the topic of your own website.

The aspect of 'topic relevance' is, however, not necessarily an exclusion criterion, for example for a link exchange. Links from off-topic pages can also have a positive impact on the ranking of your site ...

Offpage optimization - The authority of the links

The search engines divide web pages into various levels of authority. These indicate whether a page offers insignificant information on a single topic, or whether it is, e.g., an authority that can provide the visitor with extensive data or information on a topic or subject area. Links from expert or authority pages are weighted correspondingly more heavily.

You can recognize authority pages by the fact that they contain many links to related information; at the same time, many other sites point to the authority ...

Offpage optimization - the number of links

The number of web pages with a link to your page is also an important ranking factor. It should be noted again, however, that a single strong link, e.g. a link from a page with a high PageRank or from an authority page, can have a larger influence on the ranking of your site than many weak links. Nevertheless, the total number of inbound links is important. A distinction is made between link popularity, domain popularity and IP popularity.

Link popularity: The link popularity is the number of links that point to your page. For the link popularity, all links are counted, including several originating from the same website.

Domain popularity: This concept refers to the number of links from different domains pointing to a website.

IP popularity: For the IP popularity, only links from different IPs or IP blocks are counted. In general, all three factors flow into the ranking of a web page.

The exact weight of each popularity measure is difficult to pinpoint. The link popularity, being only the raw number of links, is presumably weighted the weakest, since, e.g., a single footer link on a large site can easily yield several hundred links. The problem with the IP popularity is that with some ISPs many customer websites share the same IP. For these reasons, the domain popularity is probably weighted the most ...
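The three measures can be illustrated with a short Python sketch. The backlink data is purely hypothetical, and grouping IPs by their first three octets is only one common way of approximating IP blocks:

```python
def popularity(backlinks):
    """Compute the three popularity measures described above.

    `backlinks` is a list of (url, domain, ip) tuples; the data used
    below is invented for illustration.
    """
    link_pop = len(backlinks)                           # every link counts
    domain_pop = len({domain for _, domain, _ in backlinks})
    # Approximate IP blocks by the first three octets of each address.
    ip_pop = len({".".join(ip.split(".")[:3]) for _, _, ip in backlinks})
    return link_pop, domain_pop, ip_pop

links = [
    ("http://a.example/page1", "a.example", "192.0.2.10"),
    ("http://a.example/page2", "a.example", "192.0.2.10"),   # same domain
    ("http://b.example/",      "b.example", "192.0.2.99"),   # same IP block
    ("http://c.example/",      "c.example", "198.51.100.7"),
]
print(popularity(links))  # → (4, 3, 2): link, domain and IP popularity
```

Four raw links shrink to three domains and only two IP blocks, which is exactly why the weaker measures can be inflated so easily.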

Offpage optimization - Link Age

The age structure of the external links has been playing an ever-increasing role for some time. To counter a sudden rise in the search results, e.g. through short-term rented or purchased links, it usually takes some time before external links are fully taken into account.

If you buy or rent links, you have to assume that they sometimes take their full effect only after 3-6 months. You should in any case factor this period into your investment planning, especially since it can also take up to three months until the inherited PageRank of the new links shows on your page ...

Offpage optimization - Link position

A web page is divided by the search engines into more and less important areas. The most important part of a page is the actual text, i.e. the content. A link within the text is weighted higher than, for example, a link somewhere at the edge of a web page.

In general, the following classification applies: a link within the text or in the header of a page carries the highest weight. Then follow links to the right or left of the content. So-called footer links, links at the bottom of a web page, are not evaluated as strongly, because many website operators use this place to accommodate, for example, swapped and thus not 'real' links. The same goes for special pages, for example partner or link pages; links from these are usually devalued by the search engines.

Try to place your link in the actual content of the foreign page. Write, e.g., a brief informational or advertising text about your page which contains a link on one or two keywords. You can offer this to your link partners as an alternative to a simple text link ...

Offpage Optimization - link text

The text with which your website is linked is of great importance. The link texts make it easier for the search engines to isolate the topic of your site more accurately. In addition, they indicate that relevant information on the keywords contained in the link text can be found on your site.

An external link should therefore always contain one or more keywords important for your site. Even the position of each keyword within a link text matters: place important keywords at the beginning of the link text and less important ones towards the end. In general, the link text should not exceed a length of 2-4 words, since the value or 'power' of the link is distributed across the individual keywords.

As explained in the chapter 'Domain Name', the name of your site should contain one or two of your most important keywords. Many website operators who link to you simply use the domain name or URL as the link text; if it contains your keywords, your page will be listed higher for these search terms. To make sure the important keywords are used in external links, it is a good idea to offer some link text templates or suggestions on your website. Visitors can then, e.g., simply copy the HTML code of the link onto their own website. It is important that the link texts pointing to your page vary: a repeatedly appearing identical link text will be evaluated negatively. Use different link texts to optimize for multiple keywords simultaneously and to insert synonyms ...

Offpage Optimization - PageRank

One criterion by which external links are reviewed and compared is the so-called Google PageRank. Put simply, it defines the 'value' of a web page based on the number and quality of the links pointing to it. The PageRank is expressed on a scale of 0-10, with 10 being the highest value. A detailed explanation can be found in the chapter 'Google PageRank'.

It makes a big difference whether a page with a PageRank of 2 sets a link to your page, or a page with a PageRank of, for example, 6. The link from the website with a PR of 6 is usually weighted higher by the search engines than the PR 2 link. In general, almost every link has a positive impact on the ranking of your website; nevertheless, you should always try to get the most powerful links possible to your page ...
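The basic idea behind the PageRank can be reproduced in a few lines. The following Python sketch implements the originally published iteration formula on a tiny invented link graph; Google's production algorithm is of course far more complex than this:

```python
def pagerank(links, damping=0.85, iterations=50):
    """Iteratively compute PageRank values for a small link graph.

    `links` maps each page to the list of pages it links to. Each page
    passes its rank on in equal shares to the pages it links to.
    """
    pages = list(links)
    n = len(pages)
    rank = {page: 1.0 / n for page in pages}
    for _ in range(iterations):
        new_rank = {}
        for page in pages:
            incoming = sum(rank[src] / len(out)
                           for src, out in links.items() if page in out)
            new_rank[page] = (1 - damping) / n + damping * incoming
        rank = new_rank
    return rank

# Hypothetical three-page graph: both A and C link to page B.
graph = {"A": ["B"], "B": ["C"], "C": ["A", "B"]}
ranks = pagerank(graph)
assert max(ranks, key=ranks.get) == "B"  # the most-linked page ranks highest
```

The sketch also shows why a single link from a strong page counts for so much: the rank a link passes on is the linking page's own rank divided by its number of outgoing links.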

Offpage optimization - Link Review

External links, i.e. links from other sites, are essential for a good ranking of your site. The number and quality of these links is the most important external ranking criterion.

Each of these links, however, is evaluated and weighted differently by the search engines. This is how it can happen that a page with only a few, but high-quality and strong, links is listed above a site with very many, but rather weak, links. Some factors that the search engines consider in the assessment of individual links are briefly presented at this point. The following points can also help you to evaluate links yourself; this can be an advantage, for example, when assessing and selecting appropriate link partners ...

Onpage Optimization - Stemming

The use of words with the same word stem in the text of a web page is very important in order to be listed high for the corresponding keyword. Just like the presence of synonyms, this shows the search engines that the entire content of your page is relevant and can offer the searcher extensive information ...

Onpage Optimization - synonyms

If you want your page to be listed in the top positions for a specific keyword, it is important that the text of the website also contains synonyms and similar or related terms. It makes no sense, however, to optimize a website for keywords unrelated to its content. Always focus on a few keywords or phrases and try to optimize for them specifically ...

Onpage Optimization - page number and size

Another internal ranking criterion is the size of the internet presence, i.e. the number of pages and the size of the pages themselves. The search engines assume that on a website consisting of 100 individual pages, more detailed information on the search word can be found than on a web presence with only 10 or 20 individual pages.

Offer the search engines as much information as possible. Always make sure, however, that really every single page contains individual information. The general rule is that the quality of a web page is much more important than quantity or size ...

Onpage Optimization - Individual Content

As mentioned in the chapter 'Content', the content of your site should be individual and unique. Only if the crawlers of the search engines really find new information will your page be listed high. If the same or similar information is already mentioned elsewhere on the internet, this is not the case.

With almost identical content, e.g. texts you simply copy from other websites, your site will be penalized for duplicate content and possibly removed from the index. You should therefore always offer the search engines, and the visitors to your page, individual information.

Individual and unique content is one of the most important, if not the most important, ranking factor of all ...

Onpage Optimization - Other Ranking Factors

In addition, there are numerous other internal, i.e. page-related, ranking criteria. Experts believe that Google alone considers well over a hundred factors in the listing of search results. Of course, not all of these factors can be explicitly mentioned or explained here; only a few of them are briefly touched upon ...

Onpage Optimization - Outgoing Links

Another important ranking factor are outgoing links, especially to important and topic-relevant websites, so-called authorities. These links provide information about the content of your site and show the search engines that it is part of the infrastructure of the internet.

In general, no operator of a website will object if you set a link to his site. Make sure, however, that the target site is topically appropriate and the link text legally unproblematic. Also, insert only a few selected links on a page, since with every additional link the strength or 'power' of the other links on that page declines. To keep the link power as high as possible, you should never place more than 10-15 outbound links on one page ...

Onpage Optimization - Domain Age

As briefly mentioned, it should be noted here once again that the domain age is an important ranking factor. According to the principle 'old is gold', old and well-known websites or authorities receive preferential treatment in the search engine result lists. The search services assume that pages which have already existed for some years and have been updated regularly can provide the searcher with more extensive and better information than, for example, a very new page. On the other hand, a new page very often receives a so-called 'boost', i.e. it is ranked above average for a short time.

Of course, you cannot directly influence the domain age, but you should register a domain as soon as possible and submit it to the search engines ...

Onpage Optimization - Indexing

If your website is to be found, it must be indexed by the search engines, i.e. included in the index. Ensuring that a website is indexed is therefore an important component of successful search engine optimization.

So that the search engines can read and evaluate your site, it must be available in error-free HTML or ASCII text format. The other main prerequisites for optimum indexability were already discussed in the previous chapters.

A clear and simply structured internal linking, supported by clear site navigation and a sitemap, makes it easier for the crawlers of the search engines to capture your complete internet presence. You should always keep the number of hierarchy levels of your site to a minimum: the 'flatter' the hierarchy of your site, the better it can be indexed.

First, however, your site must be found by the search engines or crawlers at all. When you publish a web page, you can register it with the individual search engines. Only then do the search engines 'know' of the existence of your site, and they will visit it sooner or later. To accelerate the process, you can also set some external links pointing back to your site. The search engines will then find your website on their own, and an additional submission is usually no longer necessary ...

Onpage Optimization - Other meta tags

There is a whole series of other meta tags, of which only the most important are briefly explained here ...

Robots: The robots tag should not be missing on any website. Using this tag, you can give the robots or crawlers of the search engines specific instructions, e.g. whether your page should be crawled or not. If you want to give more differentiated commands, i.e. lock certain parts of your website against indexing by the search engine crawlers, an external text file is recommended, the so-called robots.txt.
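In practice, the robots tag and the robots.txt file might look as follows; the directory names are hypothetical examples:

```html
<!-- robots meta tag in the <head> of a page: index the page, follow its links -->
<meta name="robots" content="index, follow">
```

```text
# robots.txt in the root directory of the domain:
# lock the (hypothetical) areas /admin/ and /temp/ against all crawlers
User-agent: *
Disallow: /admin/
Disallow: /temp/
```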

Revisit: This tag is also aimed at the search engines, indicating at what time intervals the site should be visited by the robots. If your site is constantly updated, you can ask the crawlers to sift through it every day for new information. If the content rarely changes, an indication to visit your site every 2 weeks, for example, is sufficient. The statement, however, is merely a hint: when and how often your site is actually crawled can hardly be influenced.

Caching: This meta tag prevents the caching of a web page and should mainly be used for sites that are updated very frequently.

Content-Type: This tag specifies the nature of your site and the character set used. The information makes it easier for the crawlers to read special characters such as umlauts.

Language: Here the language used on your site is specified.

Page-Topic: This tag defines the topic of a web page. If the topic is not entirely clear, or if several topics come into question, multiple entries are also possible.

Page-Type: Here the type of your website is stated. Whether it is, for example, a pure information page, a guide, a directory or a community, you can inform the search engines about this with an appropriate entry in this tag.

Author: The author tag names the author or the person responsible for the content of the website.

Copyright: At this point, you can note the copyright and thus draw attention to your rights to the content of your website.
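Taken together, the tags described above could be filled in as follows; all values are, of course, only invented placeholders:

```html
<meta name="language" content="en">
<meta name="page-topic" content="search engine optimization">
<meta name="page-type" content="information">
<meta name="author" content="Jane Doe">
<meta name="copyright" content="Example Ltd.">
```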

There are some other meta tags, which, however, play no role in the ranking of your website. Really relevant are only the tags title, description and, to a limited extent, keywords. What is important is that you define different metadata on every page of your website, precisely tailored to the respective content. Identical meta tags on all pages should be avoided. So-called 'meta tag generators', which automatically convert your personal information into meta tags, can help with the preparation. Some generators are recommended in the chapter 'Links and software tools' ...

Onpage Optimization - Keyword Analysis

To select the right keywords, a so-called keyword analysis is carried out. It serves mainly to find relevant search words or phrases under which your site has the greatest chance of appearing far up in the results of the search engines.

First, think of a number of possible keywords or keyword combinations that match the topic of your site and under which your site should be found.

Then you should investigate how many pages are already listed under the respective search terms. Simply enter the terms into the various search engines. There can easily be several million results for a single keyword. The more hits a search yields, the more difficult it usually is to get a new website listed far up under the corresponding keyword.

Analyze the websites on the first three positions or on the first results page. From these competitors you can easily find out under what conditions, e.g. PageRank, link popularity or domain, your website could be displayed there.

Moreover, it is helpful to look at the HTML code of these websites. In the meta tags you can often find all the relevant search words, which will give you new suggestions for your own choices. Often it makes more sense in the initial phase to optimize a website for keyword combinations. Here, too, proceed as described above.

In this way, you can systematically reduce the number of your selected keywords to a minimum. Among the remaining words or word combinations, you then filter out those that are searched for most often and, at the same time, are not yet used so often by your competitors among the website operators. There are numerous tools for this. Some show, for example, the number of relevant searches within a certain period. Other programs also evaluate which, or how many, other websites are optimized for a keyword, or how high the chances of a new page are to be listed under the respective keyword. A small selection of these tools can be found in the chapter 'Links and software tools'. Finally, settle on a few keywords or combinations. At the beginning, 3-5 keywords are sufficient. Later, when these are listed well, you can add more.

The keywords are the basis of any search engine optimization and must therefore be chosen well ...

Onpage Optimization - Meta Keywords

The keyword tag now has no major impact on the ranking and is even completely ignored by some search engines. It has, however, no negative consequences if you use this tag and list the most important keywords of your web page in it. The text should be no more than 150-200 characters. Do not list too many keywords, or keywords that do not correspond to the content of the page. Many search engines recognize such targeted attempts at 'manipulation' and may punish them. Of course, you can also insert keyword combinations, but you should be careful not to repeat a single word too often.
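A keyword tag within the limits described above might, purely as an invented illustration, look like this:

```html
<!-- A few page-relevant keywords, no single word repeated excessively -->
<meta name="keywords" content="search engine optimization, onpage optimization, meta tags, ranking">
```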

Onpage Optimization - Meta Description

The description tag describes the content of a page and appears on some search engines as the text snippet in the search results. Add a concise and meaningful summary of the content of your web page here. It should encourage searchers to visit the site.

The description tag should not be longer than 200 characters, as it may otherwise be truncated. Try accordingly to place the important information about your website at the beginning of the text.
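A description tag following these guidelines could, for example, read as follows; the text is invented and only meant to show the structure:

```html
<!-- Important information first, well under 200 characters -->
<meta name="description" content="Onpage optimization explained: how meta tags, titles and individual content improve the ranking of your website.">
```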

Onpage Optimization - Title

Although not actually a meta tag, the title tag shall be described and explained briefly at this point. The title tag is one of the most important onpage ranking factors. In addition, the text in the title tag appears on most search engines in the result lists as the heading of the hit.

You should therefore find a text that, on the one hand, contains the relevant keywords for the search engines and, on the other hand, offers the searcher clearly understandable summary information. Combining both is often not so easy, because the text in the hit list is automatically truncated after a certain number of characters. Restrict yourself therefore to about 50-75 characters. In no case should important keywords simply be strung together without substantive context; this is rated negatively by the search engines. Try to place approximately 5-7 keywords in a logical context.

Especially in the early days, it makes little sense anyway to optimize a page for many keywords. The important thing is, in any case, that the text also appeals to the visitor, because what good is a high ranking if your page is not 'clicked'?
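A title along the lines described above might, as an invented example, look like this:

```html
<!-- Roughly 50-75 characters, keywords in a readable context -->
<title>Onpage Optimization: Meta Tags, Title and Content for a Better Ranking</title>
```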

Onpage Optimization - Meta Tags

Meta tags are HTML elements containing so-called metadata about the page in question. They are placed in the HEAD section of a page and summarize its most important information. Furthermore, meta tags make targeted browsing easier for the crawlers by giving them important instructions. Since meta tags are invisible to visitors, there have been repeated attempts to insert as much information into them as possible.

Especially in the early days, the search engines had major problems classifying pages appropriately because of the large amount of often in no way topic-relevant information. Above all, a lot of abuse was driven with the keyword tag. For this reason, the importance of most meta tags for the ranking of a web page is now rather low. Apart from two or three tags, some search engines even ignore the information completely. Nevertheless, some meta tags are very important and should not be missing from any website ...

Onpage Optimization - Timeliness

A website whose content is regularly updated and expanded receives more attention from the search engines than a page whose contents remain unchanged for a long time. Updates induce the crawlers of the search engines to visit your site more often. However, if after repeated indexing of your website no new content is found, the period between crawler visits is extended again.

Of course, you do not have to constantly change the entire contents of your Internet presence, but you should try to continuously expand the information on your website. Meanwhile, so-called 'open content' can be found on the Internet to an ever greater extent. This information is not protected and can be used for your own purposes. It is not enough, however, simply to copy such content onto your own website. The 'duplicate' content produced in this way is assessed negatively by the search engines and punished. The freely accessible texts should rather serve as inspiration for new, individual information of your own.

If you do not have time to constantly generate new content, one possibility, among others, is to involve the visitors of your website in the creation of new content. You can, for example, integrate a forum or an opinion board into your web page and display the newly written contributions or comments on the homepage. Even an embedded news blog can help keep your site up to date ...

Onpage Optimization - Outline

Not only for the visitors, but also for the search engines, it makes sense to clearly divide the text of a website. In general, you should make sure to create your entire page in HTML defined in accordance with the usual standards. An 'unclean' HTML code may cause a search engine crawler to leave your website and no longer crawl it.

For the structure of the text contained on a website, the HTML tags H1 to H6, which declare the different headings, are especially important. H1 defines the main headline, and accordingly its typeface is also the largest. If you do not like the presentation, you can change the formatting using CSS, for example, and adapt it to the design of your website. Use H1 for the actual headline of your site; it should contain the most important keywords. The H2 tag, and possibly the H3 tag, allows you to structure your page with further headlines. An additional integration of the tags H4 to H6 is not necessarily required.

Another means of highlighting a specific part of a text are text markups such as bold or underlined. Try to structure the text understandably and clearly. Particularly highlighted passages and, above all, headings are rated as important by the crawlers of the search engines and help you to be listed higher under the keywords they contain ...
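The heading and markup tags mentioned above could be combined in a page body, for example, like this (the texts are invented):

```html
<h1>Onpage Optimization</h1>   <!-- main headline with the central keywords -->
<h2>Structuring the Text</h2>  <!-- sub-headline for a section -->
<p>Search engines rate <strong>highlighted passages</strong> and headings
   as particularly important for the keywords they contain.</p>
```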

Onpage Optimization - Images

Because, as already mentioned, search engines cannot evaluate images or other graphical elements, there are some important points to observe when integrating them. First, however, it should briefly be pointed out that the data size of images and multimedia elements should be kept as small as possible. Although a majority of Internet users now use DSL, there are still many users who have a modem or ISDN connection. For this reason, an image should not be larger than 100-150 KB.

Add to each image a short alternative text (alt attribute), a further description (title attribute), or even a link to a detailed description (longdesc attribute). The alt attribute actually has the function of offering the visitors an alternative text if for some reason the respective images are not displayed. This is the case, for example, if certain browsers do not support the files, or the automatic loading of these files is not enabled. The alt text also appears when the visitor moves the mouse over the image. It should therefore be a brief summary or description of the displayed image.

The title attribute allows additional explanatory information to be added to each element or image. In general, at least the alt attribute should always be used. If you want to make comprehensive information related to a picture available to the visitor, you can use the longdesc attribute to set a link to a description page. Since this link does not appear for visitors, add a simple text link next to the image in addition.
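An image embedded with the attributes described above could, as a sketch with invented file names, look like this:

```html
<!-- alt: short alternative text; title: additional information;
     longdesc: link to a detailed description page -->
<img src="garden.jpg"
     alt="Rose garden in summer"
     title="Our rose garden, photographed in July"
     longdesc="garden-description.html">
<!-- Visible text link in addition, since the longdesc link is not shown -->
<a href="garden-description.html">Description of the picture</a>
```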

With the help of these attributes, you provide the search engines with valuable further information about the content of your site, which ultimately has a positive impact on the ranking ...

Onpage Optimization - Texts

The only content really interesting for search engines is the text of a web page. Pictures or animations may look beautiful, but search engines usually cannot evaluate them. Nevertheless, all media should generally be used wherever something can simply be explained better with the help of pictures or animated graphics. For their use, however, some rules must be observed; these are covered in detail in the chapter 'Images'.

Try, in any case, to insert as much individual text as possible on your web page. The more text, the greater the chance of being found under a keyword or under combinations of two, three or more keywords. As described in the chapter 'Technology of the search engines', a search engine analyzes the text of your website and, simply put, assigns the terms it contains to certain keywords or keyword combinations. Experts estimate that only about 15 million search words are actually used. If you compare this figure with the number of existing web pages, this means that each query often yields millions of results.

To increase the frequency with which your site is listed, you can, for example, increase the number of keywords or terms contained in the text of your web page. You should, however, never ignore one important principle: the text of your site is primarily intended for your visitors. Do not make the error of optimizing the text exclusively for the crawlers of the search engines.

A coherent, individual and informative text is the basis of any optimization work. Search engines react very sensitively to explicitly targeted text manipulation. You should refrain from placing the search words with which you want to be found excessively often in the text of your site. This so-called 'keyword stuffing' can lead to your site being dropped from all result lists and from the index. The keyword density should not exceed a value of 4-6%. A tool for determining the exact keyword density can also be found under 'Links and software tools'.

When optimizing the text of your website, you can concentrate not only on a few keywords, but also try to place combinations of several words in the text. It is sometimes advantageous to subdivide the optimization of a web page into several phases.

Because it makes little sense to optimize a new website for really highly competitive keywords right away, it is advisable, especially at the beginning, to first optimize for less contested keywords or combinations. How to find such terms or words, you can read in the chapter 'Keywords'.

Another error, which may even cause your page to be removed from the index and 'banned', is so-called 'hidden text'. There are several possibilities to insert or modify certain parts of a text so that the visitor cannot see them, but the crawlers of the search engines recognize and evaluate them. Widely used was 'white letters on a white background'. What still worked in the beginning is now recognized and punished by every search engine.

In addition to the inclusion of relevant keywords, there are some other possibilities for optimizing the text of a web page. Refer to the chapter 'Outline' ...

Onpage Optimization - Content

The term content stands for the actual substance of a web page. The content of a page comprises all the media presented: text, images, audio and video files. The content of a site is by far the most important ranking factor of all. Whoever regularly fills his website with individual content will also be listed high in the results of the search engines ...

Onpage Optimization - Internal Linking

A clearly structured internal linking is essential for the ranking of the individual pages of a site. Actually, this should already be guaranteed through the use of a structured site navigation or a sitemap, but there are still some additional possibilities to improve the internal linking.

Again it is important to ensure that, as far as possible, no graphical elements or images are used as links, because the search engines as a rule cannot recognize the text displayed in them. If it cannot be avoided, use in each case the alt and the title attribute to give the search engines additional information. Even with simple text links, it is recommended to use the title attribute. Also add useful referring links within the content of your site. Visitors who are interested in a particular topic should be guided, within the text or via recommendations of related links or articles at the end, to thematically similar pages. Use important keywords here too for the internal linking.

For large websites, it is additionally advisable to create thematic lists or sitemaps. The integration of a keyword list, which is cross-linked with the respective pages, also has many advantages. As an additional navigation aid, you can help the visitor with a so-called position indicator or breadcrumb bar. Based on this display, he sees exactly where he is currently located and can switch between the levels of the navigation with only one 'click'. Also take the opportunity to use keywords within the link text of obligatory footer links like 'Home', 'Sitemap' or 'Imprint'. Write, for example, 'Keyword Home', or use graphic elements here as an exception and add appropriate alt texts.
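The measures described above could be combined, with invented page names, roughly as follows:

```html
<!-- Text link with title attribute and a keyword in the link text -->
<a href="keyword-analysis.html" title="How to carry out a keyword analysis">Keyword analysis</a>

<!-- Breadcrumb / position indicator, each level clickable -->
<p>You are here:
  <a href="index.html">Home</a> &gt;
  <a href="onpage.html">Onpage Optimization</a> &gt;
  Internal Linking
</p>

<!-- Footer link carrying a keyword instead of a bare "Home" -->
<a href="index.html">Search Engine Optimization Home</a>
```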

Through the above measures, you can improve the ranking of your website. Moreover, it becomes easier for the search engines to assign the content of your page thematically on the basis of link texts indicative of the content of the linked pages ...

Onpage Optimization - Sitemap

A sitemap is a hyperlinked list of all pages of an Internet presence. For large projects, the actual sitemap can also be limited to the representation of the hierarchical structure, with the sub-categories and lower levels of the pages listed on further sitemaps. But even on smaller sites, an additional representation of the hierarchical structure makes sense.

A sitemap has two main functions. First, it offers the visitors of a website a simple overview and a simple navigation aid. Second, it provides the search engines with relevant information about the website. Sitemaps specifically for search engines are also called meta sitemaps. Here, information (relevance, relationships, last update, last changes) is stored in the form of a standardized XML file on the server and submitted or linked.

This allows the search engines to crawl the site quickly and effectively. Meta sitemaps can easily be created with special programs; some selected software solutions can be found under 'Links and software tools'. With some search engines, it is possible to submit the sitemap of a website directly; it will then automatically be fetched at regular intervals. At Google, you need your own account for this. In addition to many other useful pieces of information and tools, you then have the possibility of submitting a special sitemap.

In addition, it is useful to offer the visitors of a site a separate sitemap as a navigation aid. This should preferably contain not more than 50-100 links and should also be linked directly from the home page. If your site comprises more than 100 pages, it is better to structure the sitemap hierarchically. A subdivision into a main map and one or more sub-maps is an advantage here too, similar to the preparation of the site navigation. With the help of a sitemap, you can also try to keep the number of levels of a web page to a minimum. This has, among other things, the advantage that the PageRank is distributed evenly across all pages. The linking of the various pages should, as with the navigation, contain relevant keywords.
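A minimal meta sitemap of the kind described could, with an invented domain, look as follows; the tag names correspond to the common XML sitemap protocol:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2007-01-15</lastmod>   <!-- last change of the page -->
    <changefreq>weekly</changefreq> <!-- how often the content changes -->
    <priority>1.0</priority>        <!-- relevance relative to the other pages -->
  </url>
</urlset>
```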

A sitemap helps visitors to find their way around your website quickly and purposefully. In addition, it helps to ensure that your website is crawled fully and regularly ...

Onpage Optimization - Page Building and Menu

The page structure and the menu of a web page must be simply and clearly structured. Avoid dead links. These so-called 'broken links' can, if they occur frequently, even lead to exclusion from the index of a search engine. Appropriate tools to detect broken links can be found in the chapter 'Links and software tools'.

In general, design the navigation of your site in such a way that the visitor can easily understand it. If this is the case, a search engine crawler will also have no major problems reaching every single page of your Internet presence.

A website is always divided into several levels. The first level is the home page. On the home page, usually all the important main categories are linked. From the pages of the main categories, you should again be able to reach the home page as well as the sub-categories of the first order. From these pages, visitors must in turn be able to reach the home page, the main categories and the sub-categories of the second order. This structure should be continued step by step up to the last level. The last level of a web page is linked to the home page and to the pages of the level above it. For small and medium-sized websites, you often get by with a total of two or three levels, but even large projects can be sensibly divided into up to three or four levels. With the help of PHP or CSS, for example, this structure can be arranged and managed.

A higher number of levels is to be avoided, because the crawlers of the search engines do not automatically follow every path, but may leave the site again after a certain level. This can be countered, among other things, by external links to individual sub-pages.

During the implementation, that is, the creation of the menus, you should make sure to use HTML or text elements wherever possible. Avoid scripts or Java applets, as crawlers repeatedly have problems interpreting them. In addition, there are browsers and visitors that cannot display JavaScript or have the function disabled. Images, too, are rather inappropriate as components of a menu. Take the opportunity to give the visitors, and hence the crawlers, targeted information about your site through meaningful keywords focused on the content of the linked page. The link texts of the internal linking also have a positive impact on the ranking of a web page. A Flash-animated navigation should likewise be avoided. If you do not want to do without it, you must offer the search engines an HTML-based alternative.

For large projects, it can also help to set up a so-called sitemap, i.e. simply a structured overview of the various pages; refer to the next chapter. With the help of a sitemap, levels can be reduced quickly and easily and unnecessary links avoided. Moreover, with a sitemap, the PageRank of a web page can be distributed evenly across all pages.

Put simply, a navigation aligned to the visitor and his natural behavior will also lead the search engines to grasp each page faster and better ...

Onpage Optimization - URL Design

A URL ('Uniform Resource Locator') identifies a resource by its location and the mechanism used to access it. A URL consists of reserved and unreserved characters.

The correct URL assembly and the right URL design are an important prerequisite for a good placement in the results of the search engines. Even before the creation of a web page, you should think about the structure and the naming of the individual pages. Once a web page has been indexed by the search engines and listed in the results, the URLs should not be changed.

Generally, it can be said that the URL design, just like the entire site, should not be made specifically for the search engines, but primarily for the visitors. You should always define understandable URLs appropriate to the content of each page. Use important keywords and avoid hardly understandable combinations of figures and characters. Several keywords can be combined with a hyphen; the search engines then treat them separately. File names should not necessarily appear in the URL, as this makes, for example, a subsequent dynamic conversion of a static site difficult. Parameters are also rather inappropriate.

Another mistake is an often too complex or 'deep' link structure. The crawlers of the search engines often index and pursue only the first levels of a website. Moreover, clearly and simply structured web pages can be modified and adapted more easily. If the link structure is clear, an entire directory can also easily be moved to another position or a new level.

These so-called 'speaking' URLs, i.e. clearly structured and easily readable addresses, are an advantage for the search engines as well as for the visitors of a page. The search engines can quickly narrow down the topic of the page on the basis of the keywords in the URLs. The visitor in turn knows immediately what to expect and is not put off by a confusing-looking URL ...
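The difference can be illustrated with two invented addresses for one and the same page:

```text
Hard to interpret:  http://www.example.com/index.php?id=173&cat=9
'Speaking' URL:     http://www.example.com/onpage-optimization/meta-tags
```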

Onpage Optimization - Important Web Editors

Microsoft FrontPage: Built according to the WYSIWYG principle, Microsoft FrontPage is the most widely used HTML editor of all. It is particularly characterised by its clear and easy-to-understand user interface, as well as by its site and link management. For smaller sites, FrontPage is perfectly suited. In the preparation of large websites, however, the categorization of various contents makes it increasingly difficult to manage the pages. It has also turned out that FrontPage produces rather 'dirty' code. The older versions additionally lack support for different programming languages such as PHP. In the newer versions, however, these shortcomings are said to have been largely corrected.

Dreamweaver: This HTML editor from Adobe Systems (formerly Macromedia) also follows the WYSIWYG principle. The user interface and the functionality are, in contrast to Microsoft FrontPage, more differentiated and therefore often confusing. Dreamweaver, however, produces 'clean' and valid HTML code even for larger projects, and also supports PHP and other programming languages. For professional web developers, Dreamweaver is recommended. For beginners who want to create a new website relatively quickly and easily, this program is rather inappropriate.

Adobe GoLive: This editor, too, is, due to its complexity and usability, aimed more at professional and experienced web developers. With the help of many different components, it is possible to simplify the creation of complex page structures in a 'drag and drop' procedure. GoLive supports PHP and is available for the operating systems Microsoft Windows and Mac OS. The WYSIWYG editor is, however, gradually being replaced by Dreamweaver, also a product of the company Adobe Systems.

NetObjects Fusion: This HTML editor is a product of the company Website Pros. As a WYSIWYG editor with a simple and straightforward user interface, this program is also suitable for beginners. Various features, such as the integration of various templates, ease operation and expand the functionality. NetObjects Fusion supports PHP and other dynamic content and is suitable for both small and large websites.

In addition, there are many other web or HTML editors. Beginners in the field of web page design should make sure to choose an editor following the WYSIWYG principle. As the name suggests, every work step can be observed and understood directly here. With pure text editors, the code must be entered directly, so you must also know HTML. It is also advisable to pay attention to a simple and straightforward user interface. The editor of your choice should also support PHP ...

Onpage Optimization - Web and HTML editors

Many of the so-called web editors are HTML editors. An HTML editor is a tool that helps you create Internet pages in the general standard format HTML. A distinction is made between programs in which the HTML source code is generated or processed directly, and so-called WYSIWYG editors (short for 'What you see is what you get'), in which the page can already be viewed during processing, similar to how it appears in a browser ...

Onpage Optimization - Important Programming Languages

C++: The programming language C++ is widely used in many fields. It is based on the programming language C and is used, for example, in the programming of complex operating systems and in the development of computer games.

Java: Java is a product of Sun Microsystems and part of the Java technology. With Java, short commands or entire program structures can be implemented quickly and easily. The Java programming language is supported by most operating systems and requires only a special runtime environment, also known as the 'Java platform'.

Perl: The programming language Perl was developed on the basis of various other languages. Its advantage lies in the flexible interpretation of individual elements or entire units. Because of this property, Perl finds application in many fields, for example in system administration or in the implementation of web applications.

PHP: 'Personal Homepage Tools', or PHP, serves, for example, for the creation of complex dynamic web applications. Almost all websites that generate dynamic content use PHP. It is supported by all operating systems. PHP is a server-side interpreted language and therefore requires no special preconditions on the client side.

HTML: The 'Hypertext Markup Language' is a text-based markup language. HTML documents or websites are the real building blocks of the Internet. To view these documents, a so-called web browser is required. The hyperlinks usually contained in them link the various documents with one another. Only through this networking of the individual documents or web pages did the complex and structured entity of the Internet and the World Wide Web develop. Most websites are based on HTML.

Typo3: The web content management system Typo3 facilitates the publication and management of websites. It is based on the scripting language PHP. Typo3 is 'open source' software and is also being developed further constantly.

Onpage Optimization - Programming Languages

A programming language is a formally defined language with the help of which it is possible to issue direct instructions or commands to computer systems.

Since our natural language is too general and in many areas not unambiguous, communication with computer systems requires a computer language which is more precise and which matches the complex algorithmic structure of the computer.

A programming language contains, on the one hand, a syntactic structure, so to speak its external form, and, on the other hand, a semantic structure. The semantics is understood as the exact meaning of the individual elements of the language. Meanwhile, there is a large number of different programming languages, of which only a few important ones are briefly explained here. Although not belonging to the programming languages in the real sense, the terms HTML and Typo3 are also explained at this point ...

Onpage Optimization - Web hosting

Hosting means the accommodation of a web page on a web server. The provider, mostly for payment, makes the necessary means and resources available.

When choosing a suitable provider, you should consider a few things. Choose a provider that has already been established on the market for a long time.

In addition, you should make sure that the provider meets the technical requirements necessary for your project. It is important to first clarify how much storage space and traffic, i.e. available data volume, you need for your website and, above all, may require at some later time.

Always make sure that your selected package can be expanded and 'upgraded'. The offerings range from a web business card with a storage space of a few MB to your own server. An approximate value can hardly be given here. For a normal, static web page, 50 MB of storage and 1 GB of traffic per month are sufficient. The operator of a well-attended community, however, may reach these values within a few hours. Generally, it can be said that you are often better served with a slightly larger package.

The market has become so competitive that the price difference between the offers is often only very low. In addition, your provider should support PHP and MySQL. PHP is a programming language which is used, above all, to create dynamic web pages. MySQL is a database management system.

In addition, some Apache modules are very important. Your future web hoster should in any case support mod_rewrite. With this function, you can, for example, easily and quickly redirect addresses in a search-engine-friendly way, define a standard domain, and rewrite and simplify dynamic and thus often very complicatedly structured URLs. Since these points are important for a successful search engine optimization, make sure that your provider offers or supports these services.
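As a sketch, and assuming an Apache server with mod_rewrite enabled, a rule in the .htaccess file could rewrite a dynamic URL into a 'speaking' one; the file and parameter names are invented:

```apache
RewriteEngine On
# Map the speaking address /products/42 internally to the dynamic script product.php?id=42
RewriteRule ^products/([0-9]+)$ product.php?id=$1 [L]
```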

You should also be offered the opportunity to download the server's log files. This recorded data, e.g. via which link or search your visitors found your site, where they left it again, which search keywords were used, or how long they remained on your website, helps you to adapt and optimize the structure of your website to the visitors' behaviour.

Some links which can be helpful in the choice of a server can be found under 'Links and software tools' ...

Onpage Optimization - Standard domain

If you have decided on a domain, you should in any case define a so-called standard domain, i.e. a single address under which your website can be found and retrieved. If you do not define a standard domain, it is, for example, often the case that your site can be retrieved both with the prefix 'www' and without it. Search engines recognize when websites can be found under several addresses. They may evaluate this as duplicate content and punish it accordingly.
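A standard domain can be enforced, for example, with mod_rewrite on an Apache server; the sketch below, with an invented domain, redirects all requests without 'www' permanently to the 'www' variant:

```apache
RewriteEngine On
# Redirect example.com permanently (301) to www.example.com
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]
```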

A link with information on how to define a standard domain can also be found in the chapter 'Links and software tools' ...

Onpage Optimization - Domain Name

The domain name is made up of individual names separated by dots. The last part of this name, for example .com or .de, is also known as the top-level domain, or TLD. In the domain name system, the individual domains, that is, the chosen names together with the TLD, are assigned an IP address. In this way, each domain is granted to exactly one owner. The top-level domains are essentially divided into two groups: generic and country-specific endings. Country-specific TLDs always consist of two letters; examples are .de, .at or .eu. Generic domain endings are, for example, .net or .info.

Since the choice of a domain name is very important, you should take enough time and consider some important points. A clear, understandable name that is closely related to the content of the site can contribute significantly to its success. The following general rules apply to the choice of a domain: if the website has business content and should therefore also reach international visitors, the endings .com or .biz are a useful choice. If your site contains general information, the extension .info is suitable. The domain suffix .org is often chosen by organisations.

Most often, however, the ending .com is preferred. Although all search engines treat the TLDs as equivalent, choosing a suitable .com domain makes sense: because this extension is the best known, it radiates a corresponding seriousness. If no suitable .com domain is available, for example because your desired name has already been taken, it is better in any case to choose a different domain ending than a different, less fitting name. When choosing this name, make sure that the term or text is easy to remember and can be associated with your website. The name may be 2 to 63 characters long, but should be relatively short if possible; a rough guideline is a length of 10 to 12 characters.

Furthermore, the question arises whether you want to include important keywords in the domain name. Under normal circumstances this is very advisable, for two reasons. First, the domain name plays, contrary to the opinion of some experts, a decisive role in the ranking. The second advantage of a domain containing an important keyword is the fact that many people linking to a site simply use the domain name as the link text. If it contains one or two keywords, your site will be listed higher under the corresponding search words. It can therefore even be useful to use a hyphen in the domain, because the separated words are usually treated as two different keywords. You should, however, limit yourself to a maximum of two words, that is no more than one hyphen. Longer domains quickly appear frivolous and may be devalued by individual search engines.
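The naming rules from the last two paragraphs can be collected into a small check. Note that the 2-to-63-character range is the general limit mentioned above, while the one-hyphen limit is this chapter's recommendation, not a formal rule:

```python
import re

# Sketch: check a proposed domain name (without the TLD) against the
# rules recommended above: 2-63 characters, only lowercase letters,
# digits and at most ONE hyphen, not at the start or end.
def acceptable_name(name):
    if not 2 <= len(name) <= 63:
        return False
    if not re.fullmatch(r"[a-z0-9]+(-[a-z0-9]+)?", name):
        return False  # more than one hyphen, or invalid characters
    return True

print(acceptable_name("used-cars"))        # two keywords, one hyphen: OK
print(acceptable_name("cheap-used-cars"))  # two hyphens: rejected
```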

There are also so-called 'type-in domains'. These are domain names that are so obvious that many Internet users simply type them directly into the browser. Even if the user does not know the actual site, he can be sure of finding relevant information under the corresponding URL. One example is the domain auto.de: very often, the URL www.auto.de is simply typed into the browser. Since such domain names are very popular and therefore expensive, this topic will not be pursued further here.

When choosing a name, it is important to respect existing trademarks and copyrights. Check carefully whether the name, or part of it, infringes the rights of others. The chapter 'Links and software tools' contains some useful references on this. A keyword search is also useful when choosing a domain name; you will learn more about this in the chapter 'Keywords'...

Web directories - Important web directories

The Open Directory Project (ODP): This web directory, also known under the name DMOZ (Directory of Mozilla), was originally called 'Gnuhoo'. Later, due to legal problems, it was renamed 'NewHoo'. In 1998 the directory was bought by Netscape and continued as an open-content project under the now well-known name. The open structure, in which anyone can participate as an editor, ensured that the directory grew quickly and gained importance. Thanks to the many volunteers, the waiting time for adding a new page was relatively short compared to other directories. The quality and timeliness of the links were also long unsurpassed, and inclusion in the directory was regarded as a special distinction for a website. Search engines used the descriptions of the ODP for their results and listed registered sites higher than others. Even today, DMOZ, with more than 5 million entries, calls itself the largest editorially managed web directory in the world.

Both the timeliness and the quality of the links have, however, suffered in the meantime, mainly due to technical and organizational problems. Nevertheless, an entry is still of high importance, since many search engines rate it very positively...

The Yahoo Directory: The Yahoo directory was created in 1994 as a navigation aid for the Internet and was the cornerstone of today's enterprise. Today, however, the directory is only one part of the overall Internet portal. New entries are accepted only against payment in US dollars. Some experts fear that the Yahoo directory may soon be discontinued due to a lack of quality and timeliness.

Web.de: The Internet portal Web.de, like Yahoo, also began as a web directory. In 2005, Web.de AG was sold to the United Internet Group. The directory contains approximately 400,000 mostly German pages. Meanwhile, an entry in this directory is likewise possible only against payment.

Allesklar.de: This catalogue is the biggest link directory in Germany. More than 600,000 entries, mostly with local relevance, are distributed across different categories. The directory has meanwhile expanded into a network: through an entry in allesklar.de, your site will also be listed at Lycos, Fireball, meinestadt.de, meinestadt.msn.de, freenet.de, DINO-Online and T-Online. It is possible to list a website free of charge for a limited time; after that, an entry is available only for a fee.

In addition, there are many smaller, regional or thematic web catalogues. Although the influx of visitors from an entry in one of these catalogues does not rise noticeably, you often gain good-quality, strong backlinks. Another advantage of these directories is the often very short processing time; most entries appear within a few days. A distinction must be made between catalogues with and without a backlink requirement: the former publish an entry only if you place a link back to the directory on your own site.

In general, you should consider whether an entry in a directory is really useful. If you need some backlinks quickly and easily, for example for a new site, web catalogues and web directories can be very helpful...