Archive for June 2013

Search Engines | The Big Three in view
There are many search engines in Germany. The most important are Google, Bing, Yahoo and Ask.com. While Google holds a market share of about 90%, the second-largest provider, Bing, manages only about 2.5%. Ask.com and Yahoo each reach barely 1% market share.

These figures have been in constant flux since the mid-90s, and the search engines have each evolved in their own direction. Here is a brief overview of the three major search engines worldwide: Yahoo, Bing and Google.

YAHOO!

Yahoo has been on the market since 1995 and was founded by two students at Stanford University in the USA. Initially, Yahoo! was simply a directory of the two founders' favorite websites. Yahoo is an acronym for "Yet Another Hierarchical Officious Oracle".

The website directory was sorted and categorized, and through its integration into the Netscape web browser it rose to become one of the most visited websites in the world. The purchase of several search engines, including AltaVista and Inktomi, gave Yahoo a larger market share.

The basic idea was to offer and manage email accounts and online calendars. Registration has been free for customers from the start, since Yahoo's main source of income has long been the sale of advertising banners: more customers mean more valuable ad space and ultimately higher revenue.

Today, Yahoo continues to offer free and fee-based web solutions and services on the Internet and has expanded its portfolio with the HotJobs job market, the photo community Flickr and the B2B service alibaba.com.


Bing

Microsoft's Bing is a search engine that has been live since 2009. Bing replaced Microsoft's Live Search service and remained in beta status until 2012. In German-speaking countries Bing holds a market share of only about 2.5%, but in the U.S., with roughly 25%, it is the second-largest search engine after Google.

Bing has added some important features since its launch in 2009, for example the geodata service Streetside in late 2011, for which cars equipped with cameras spent several months gathering images and data. Google offers a similar service called Street View.

As a search engine, Bing collaborates with Facebook and integrates the online community Ciao! into its "Shopping" section, where members can rate products.


Google


Google launched in 1998 and has since become the most used search engine worldwide. Its appearance and core functionality have changed little since then. Because of advertising programs such as Google AdSense and AdWords, and of course because of its market share of over 90%, most companies in German-speaking countries optimize their websites for Google. Google's success is even reflected in the dictionary, where "to google" has become a verb in its own right.

The Google founders also studied at Stanford University and in the mid-90s developed BackRub, a search engine based on backlink analysis. In 1998 the search engine was renamed Google, and the company has since grown from a staff of 8 to over 50,000 employees today.


Lighting - Proper planning


Lights and lamps to buy for your home

If you are planning to build a house, you will find that many decisions have to be made before the house even exists. Where are outlets needed, and which applications should they serve? These are just some of the questions you should be able to answer at the start of a building project. Properly planning a lighting concept is very important even before construction begins and should therefore not be forgotten. With the help of a good architect and the right contractors, you can realize your individual lighting wishes.

Lighting - realization by the electrician

Once you have hired an architect to plan your new building, he or she will put you in touch with a suitable contractor. The trades such as electricians, gas and water installers and plasterers are then involved through a tendering process. The electrician works through the plans for the electrical installation with the client and the architect, which of course also covers the lighting. According to these plans, the electrician then lays the wiring for sockets, lighting and all other electrical loads.

Ceiling lights, wall lights and Co.

In a new building, today's lighting options should be used to the full: modern, energy-efficient bulbs, decorative lighting design and LED lamps. Basically, interior lighting can be divided into basic lighting and effect lighting. Basic lighting is what serves as the main lighting, i.e. the light that is always in use. This includes ceiling lights in a hallway or kitchen as well as wall lamps in the stairwell. The basic lighting should in any case be generous enough that nobody is left in the dark; here it is worth drawing on the experience of the architect or master electrician. Work areas such as the kitchen naturally need plenty of light. Track systems that serve as ceiling lighting are a good compromise here: lights or spotlights can be added or repositioned without much effort, and modern light tracks can easily be dimmed to achieve the right level of illumination.

In a staircase, wall lights work wonderfully as indirect lighting for the individual levels. They can be switched via a staircase lighting circuit, an impulse-relay (Eltako) circuit or an intermediate (cross) circuit.

Of course, modern bus systems are also in use nowadays. The KNX/EIB standard offers actuators and the associated switches in a convenient, individually programmable system. With this technology, lamps, sockets or blinds can be controlled from anywhere in the house. There is even an iPhone app available for purchase that turns your phone into a remote control for every device in the house.

Effect lighting and design lighting

A home with modern lighting lifts the mood. Whether you prefer it cozy and understated or extravagant, different lighting scenarios let you adapt the light to your mood. Coordinate your wishes with the electrical contractor during the electrical installation. Special attention should be paid to a well-chosen, stylish lighting design.

In the living room, recessed ceiling spotlights and wall lights should be used for effect lighting in addition to ceiling lights as ambient lighting. A switched outlet for the floor lamp next to the couch should be planned, as should a ceiling outlet above the living room table. A pendant lamp is advisable there, as it is above the dining table. With that, the living area is complete. Moving on to the bathroom, mirror lighting and of course lighting for the shower and tub become very important. Swiveling ceiling spots with a protection rating of at least IP44 should be used here. Mirror lights or clip-on lights provide the light in front of the sink.

The children's room, bedroom and office each get a ceiling light in the middle of the room. In addition, table lamps serve as reading lights on the desk or bedside table. LED table lamps deserve priority here: a very long service life and low energy consumption make these lamps an absolute highlight.

Outdoor Lighting for the Home

Lights for outdoor use are weather-resistant; the materials are rust-free and usually protected against UV radiation. When drawing up the lighting concept for the outdoors, you should put a lot of emphasis on modern wall lights. These can be switched manually or via motion sensors. A timer for the garden lighting is a useful extra.

If you value a high-quality look, you can use in-ground spots for the garage driveway and bollard lights for the entrance area. High-quality stainless-steel lamps should be preferred here so that you can still enjoy them for many years to come.

Conclusion

Wall lights and ceiling lights are the classics of lighting. For an individual look, exclusive designer luminaires can be mounted. Take advantage of modern LED technology as often as possible; great looks and low energy consumption are just two of its benefits.


SEO for online shops: the most important tips and tricks


Successful eCommerce with Search Engine Optimization

Anyone who wants to succeed in eCommerce with their own online shop and beat the competition cannot do without search engine optimization. The number of competitors is simply too large, and not just in areas such as fashion or cosmetics. Even in niches the number of competing shops keeps growing, so no shop operator can rely on good value for money alone.

After all, it is first of all about being found on the Internet at all. Most Internet users use Google to search for information or products. An entrepreneur whose shop does not rank in the top 10 will find it hard to attract customers; websites that are not listed within at least the first three search result pages receive practically no visitors from search anymore. The goal, then, is to rank as high as possible in the SERPs for relevant keywords.

But what needs to be considered here? Does SEO for online shops differ significantly from SEO for blogs and other websites? Here are a few tips that do not reinvent the proverbial wheel, but that often do not receive enough attention.

Content remains king

Good content is very important not only for blogs and the like, but also for online shops. Basically the same rules apply as for other websites: the content should be interesting and unique, so you should not produce duplicate content. This problem occurs particularly often with product descriptions. They are frequently supplied by the manufacturer to all shops, and anyone who publishes them unchanged is in the company of perhaps hundreds of other stores, which Google of course does not like. It is better, therefore, to make the effort to write your own product descriptions, or at least to reword the manufacturer's text a little.

Detailed and informative product descriptions of course please not only the search engines but also the customers, who want to know all the key facts about a product before placing an order. Good product descriptions therefore also create trust, one more reason to address this aspect early on.

For large stores with hundreds of products in their range, providing each product with its own description is admittedly a challenge not to be underestimated. But it does not all have to be written at once: if the pages are gradually supplemented with the important descriptions, this still brings the desired effect. One option is also to mark product pages that are still without text with a "noindex" tag for the time being; other product pages can benefit from this in the Google ranking.
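As a small sketch of this idea (the helper function and the 30-word threshold are hypothetical, not taken from the article), a shop template could decide per product page whether to emit a noindex robots meta tag:

```python
from typing import Optional

# Hypothetical sketch: product pages that do not yet have their own description
# get a "noindex" robots meta tag for the time being; pages with real text
# stay indexable. The 30-word threshold is an arbitrary example value.

def robots_meta_tag(description: Optional[str], min_words: int = 30) -> str:
    has_unique_text = bool(description) and len(description.split()) >= min_words
    if has_unique_text:
        return '<meta name="robots" content="index, follow">'
    return '<meta name="robots" content="noindex, follow">'


print(robots_meta_tag(None))  # page still without its own description text
print(robots_meta_tag("A detailed, unique product description ... " * 10))
```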

In addition to product descriptions, there is another way to create content for online shops: customer reviews. If customers can review products directly in the store, so-called user-generated content arises. It is not only free, but also helps the shop appear more trustworthy to other users, because most Internet users place relatively high value on the opinions of other customers. In addition, shops that integrate customer reviews into their product pages usually rank better in Google. This is partly because the pages thereby gain a certain dynamism, which Google rates positively.

By the way: if you cannot avoid producing duplicate content, for example because a product appears in several product categories or is also offered as a PDF, you should take appropriate measures so that this is not rated negatively by Google. In the robots.txt file you can specify which areas should not be read by the bots. With canonical URLs you show the search engine which page is the "original", so that the other versions are not counted as duplicate content. These measures are simple and effective, yet they far too often remain unused.
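To make the two measures concrete, here is a minimal sketch; the paths and URLs are invented examples, not taken from a real shop. The robots.txt rules keep duplicate-prone areas away from the bots, and the canonical link element marks the "original" among several URL variants of the same product.

```python
# Hypothetical example: robots.txt rules for duplicate-prone areas and a
# canonical link element shared by all URL variants of one product.

ROBOTS_TXT = """\
User-agent: *
# print/PDF versions and internal search results tend to duplicate content
Disallow: /print/
Disallow: /search/
"""

def canonical_tag(canonical_url: str) -> str:
    """Render the canonical link element for the <head> of a product page."""
    return f'<link rel="canonical" href="{canonical_url}">'

print(ROBOTS_TXT)

# The same product is reachable under several category paths ...
variants = [
    "https://www.example-shop.com/women/shoes/sneaker-123",
    "https://www.example-shop.com/sale/sneaker-123",
]
# ... but every variant declares the same canonical URL, so only that
# version is counted as the original.
canonical = "https://www.example-shop.com/shoes/sneaker-123"
for variant in variants:
    print(variant, "->", canonical_tag(canonical))
```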

Choosing the right keywords and tags

Another important criterion in search engine optimization for online shops is choosing the right keywords and tags for the various products and categories. Whoever uses the words the users actually search for has much better chances of being found through search engines. Various tools provide clues about user search behavior, such as Google's own AdWords keyword tool, where you can also see what search volume different keywords have and what the competition looks like. Synonyms and related terms can be displayed there as well, as inspiration for sensible alternatives.

As far as the tags are concerned, it is important that the brand name and model number always appear in the title tag (and in the H1 heading). Anyone who offers many products in their shop should nevertheless try to create a unique title tag for every product page. A combination of brand, model and product type is often the best solution.
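A hedged sketch of such a template (the shop name and product data are invented): brand, model and product type combined into a unique title for each page.

```python
# Hypothetical title-tag template: brand + model + product type gives every
# product page a unique, descriptive title.

def build_title_tag(brand: str, model: str, product_type: str,
                    shop_name: str = "Example Shop") -> str:
    return f"<title>{brand} {model} {product_type} | {shop_name}</title>"

print(build_title_tag("Acme", "X-200", "running shoe"))
# <title>Acme X-200 running shoe | Example Shop</title>
```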

Anyone who provides many images is usually doing their customers a big favor. However, you must not forget to name the images sensibly and to make use of the alt attributes, so that Google can actually recognize what the images contain.

The most important keywords should of course also appear in the product description. But be careful, less is often more here: so-called keyword stuffing goes down well neither with readers nor with search engines.

Usability

Another aspect that must not be neglected is usability, meaning the factors that affect the customer's shopping experience. Of course every store owner wants to make buying from them as pleasant as possible. A clear shop design, a meaningful classification of products into categories and an attractive appearance are the basic requirements for a pleasant shopping experience. Loading times are also an important point that is still given too little attention: the longer it takes for a page to load completely, the higher the probability that the user loses patience and leaves. On the Internet, the competition is ultimately always just a click away.

Whoever succeeds in guiding the user through the individual ordering steps so that he feels comfortable and well advised (which is far more difficult on the Internet than in a physical store) is on the right path. A common approach, for example, is to display further products directly on the product page that would complement the currently viewed item well, or that other customers have purchased together with this product. In this way you can offer the customer some guidance within the limited possibilities of the Internet and perhaps even get him to buy more than he originally intended. It goes without saying that such offers must not be too pushy, so as not to annoy customers.

What else makes for good usability can also be found out in a consultation with an online agency.

Conclusion

Search engine optimization for online stores is in many ways more demanding and difficult than SEO for conventional company websites or blogs. It is therefore advisable to entrust the optimization of your own shop to a professional; half-knowledge is not enough to achieve results and at worst can even cause harm. Hiring an experienced professional web agency is thus a worthwhile investment with which a high return on investment can be achieved.


Google attacked online stores - punishment of bad online stores

Shop optimization: Success through usability and user experience

How to make your online store fit for the new Google update

Shop owners take note: Google is threatening to penalize bad online shops. Not even two years ago, Google put website owners on alert with the Panda update to its search algorithm. Now the search engine giant has it in for store operators: at the interactive conference SXSW last week, Google announced a campaign against bad online stores.

What sounds at first glance like pure harassment actually pursues a good goal: Google wants to provide its users with the best possible search results, and not only in terms of content but holistically. After all, the best content is useless if the overall package is not right. And it is precisely in e-commerce that Google sees glaring shortcomings. The new update of the Google algorithm is therefore explicitly aimed at online stores.

Typical mistakes of many online shops

Many shop owners make the mistake of neglecting the most important thing when setting up their online store: their users! A standard shop is easy to set up, but as the name says, it is only standard, aligned to the lowest common denominator shared by all online stores. Only individual adjustments to structure, design and user guidance give your shop an individual character and extra added value. It is often the little things that bring a big improvement.

The lifeline: Usability and User Experience

To avoid ranking crashes, it will be essential in the future to run an online store that offers good usability and a positive user experience. Google cannot check these "soft factors" directly, but it draws on measurable factors such as bounce rate and dwell time, which allow conclusions about the qualitative level.

8 Steps: How to prepare for the new Google Update

Ideally you hire a professional design agency that assists you with the initial planning of your online store or optimizes it afterwards. But even for smaller budgets or for your own implementation there are effective approaches:

Be your own customer:

Imagine you were one of your customers visiting your online store for the first time. What first impression does the shop make on you? Does it seem trustworthy and professional? Do you understand at first glance what you will find in the shop and why you should buy here rather than from the competition? Can you find your way around the shop and quickly locate what you were looking for? How would you describe the overall feeling a visit to your online store leaves with you as a customer?

Get feedback:

Ask friends to visit your online store and give you honest feedback. Of course you and your loved ones are proud of what has been achieved, but what counts is not the effort, but the result. Even if it hurts: welcome hypercritical and objective feedback, because only in this way can you improve your online store.

Use analysis tools:

Install Google Analytics or another free analysis tool. Regularly track the number of visitors to your online store, their dwell time, bounce rates, exit pages, etc. Vary texts, images and teasers and watch how the numbers change. Set up different variants of a page and test which of them achieves the better results (A/B test).

Optimize the interaction between online store and advertising:

Check which target pages your online marketing efforts link to. The home page is not necessarily the best choice. If you advertise a category or a product, link directly to it. Align your landing pages as closely as possible with your advertising. That means a little more work, but it will have a positive effect on the conversion of your visitors.

Look through the eyes of your target audience:

Test the home page of your online store with eye tracking, e.g. VisuOptiMax. This way you find out which content your users engage with and how intensively, where they look, and whether elements such as calls to action are placed prominently enough. After the home page, you can test an exemplary category page and a product detail page next, in order to optimize your online store step by step.

Provide relevant content:

Run a blog related to your offering to provide information to your visitors and to increase the time they spend on your site. For the example of a wine shop, interesting posts would be portraits of winemakers, new wines in your assortment, or your personal description and evaluation of individual wines.

Provide individual content:

Write individual texts. Do not copy your suppliers' texts 1:1; Google has never liked that, and it makes your shop interchangeable for visitors. If you have a good feel for language, write your own texts in your own tonality, or hire a copywriter. This sets your shop apart positively from the crowd.

Learn from others:

Regularly visit the online shops of your competitors and of selected providers from other industries. What appeals to you in particular in these shops? Can you incorporate it into your own shop as well?

The most important thing at the end:

Stay on the ball! Search engine optimization (SEO) is a process and thrives on continuity. That will not change with the new Google update. The advantage of user-centric optimization is that, being intuitive, it is easier to implement than the earlier, more technically oriented SEO. It is also sustainable and focused on the purpose for which your online store was actually created: your customers!


Off Page optimization

Online SEO Tutorial | Articlezeneu

Off-page optimization is the counterpart of onpage optimization. It is concerned with the reputation of a page, expressed largely through its links with other websites on the Internet.

For successful search engine optimization, both onpage and off-page optimization are important and should be handled with the same care. In practice, however, it has been shown that off-page optimization outweighs onpage optimization. This can be explained by the fact that onpage optimization is limited in its possibilities and, given the sheer variety of web pages, is no longer a sufficient criterion for fine-grained differentiation. Off-page optimization, on the other hand, represents the reputation of a website, which grows with each incoming backlink, and in principle there is no upper limit.

The figure below visualizes a section of interlinked websites.

Off Page optimization


As with onpage optimization, there are various factors, which can initially be divided into two groups: quantitative and qualitative factors.


Since 1 December 2010 there has also been official confirmation of another group of factors influencing off-page optimization. This was revealed in an interview Danny Sullivan conducted with representatives of the search engines Google and Bing: the influence of social media (especially Twitter and Facebook). But since this influence has only recently become known and is (still) very small, there is little data or information on it.

A key term that often comes up in connection with off-page optimization is link building. It refers to the deliberate placement of backlinks to improve off-page optimization (and thus also the general ranking of a website). Since link building is one of the main components of off-page optimization, my second research project deals with link building in theory and practice.

Off Page optimization - Takeaways

Finally, the most important facts about off-page optimization in a nutshell:

  • Off-page optimization refers to exploiting the optimization potential relating to the reputation of a website
  • The most important factor in off-page optimization is backlinks, which differ in various quantitative and qualitative respects
  • In the context of off-page optimization, link building is an essential component


Qualitative factors


Online SEO Tutorial | Articlezeneu

Not every backlink has the same quality. Even with the original PageRank algorithm, backlinks from websites with a high PageRank were rated higher than those from websites with a low PageRank. But since even PageRank is still relatively easy to manipulate, a number of other factors are consulted as well.

Trust Rank

TrustRank is an algorithm introduced by Yahoo that is based on a manual selection of good websites. These are assigned a so-called TrustScore, which they in turn pass on to the sites they link to. This is primarily intended to limit search engine spam.

Properties of Backlinks

Backlinks differ in strength. This strength is made up of several factors relating to the page on which the link is placed. These include PageRank, TrustRank, link popularity, etc.


Properties of Backlinks



Online SEO Tutorial | Articlezeneu

The quality of backlinks is determined by various factors. These include, firstly, the metrics presented earlier: a website with a high IP popularity, for example, is rated higher by search engines, and accordingly links from such sites count for more. The same applies to PageRank and TrustRank. But there are other factors that relate directly to the page on which a backlink is placed. These are presented below.

Google knows the link-giving website

A backlink can only have a positive effect if Google actually knows of the link's existence. This can be checked, for example, by entering the URL of the linking page into Google: the URL should then show up in the search results. For a larger number of links, a Google index checker can be used.

However, this check cannot always be applied. An example is the noindex meta tag: it prevents inclusion in the Google index, but it does not mean that a backlink from such a page will not be counted.

Link Method
The conventional method of creating a hyperlink is the <a> tag, which carries the destination URL as the value of its href attribute. Using this method ensures the full value of the backlink. For the link to count, however, it must not carry the nofollow attribute.
There are also other ways to implement a hyperlink, for example a redirect via the JavaScript window.location property or a link within an HTML image map. Google also follows these kinds of "links", but it is not known whether they are treated as equivalent to <a> tags.
Some websites use a special script to mask outbound links. In that case the link points to an internal address to which a parameter (for example the URL of the target page) is passed. The script then uses a redirect (HTTP status code 3xx) to forward the visitor to the desired page. As already explained for the 301 redirect, such redirects always come with a certain penalty, which reduces the value of these backlinks.
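As a small, hedged illustration of the first point (the HTML snippet is invented for the example), the classic <a> links of a page can be extracted and checked for a nofollow attribute, which would keep them from passing on link value:

```python
# Minimal sketch: extract <a href="..."> links from an HTML page and flag
# those marked rel="nofollow".
from html.parser import HTMLParser


class LinkExtractor(HTMLParser):
    def __init__(self):
        super().__init__()
        self.links = []  # list of (href, nofollow?) tuples

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        attrs = dict(attrs)
        href = attrs.get("href")
        if href:
            nofollow = "nofollow" in (attrs.get("rel") or "").lower()
            self.links.append((href, nofollow))


html = """
<p><a href="https://example.com/">normal link</a>
<a href="https://example.org/" rel="nofollow">paid link</a></p>
"""

parser = LinkExtractor()
parser.feed(html)
for href, nofollow in parser.links:
    print(href, "(nofollow)" if nofollow else "(counts as a regular backlink)")
```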

Anchor Text

The anchor text has also already been introduced and explained. For search engines, it provides additional information about the linked page and should ideally consist of the desired target keyword.

It is generally assumed, however, that Google applies a filter in this area that monitors the distribution of the different anchor texts. The basic idea behind this filter is to recognize unnatural linking, which in turn could indicate manipulation. For practical purposes this means that anchor texts should vary to some extent. For example, the full domain name is often used as the link text; another example is using one's full name when leaving comments on blog posts.

Using the domain name as link text makes sense for another reason as well: the so-called Brand update of 2009 (known as the Vince update) increased the visibility of brands, and brands are frequently linked with their name as the link text. More information and facts can be found at Mediadonis in the article on the Brand update.

Link Title

The title attribute is one of the so-called universal HTML attributes and as such can be used in almost all HTML tags. On SELFHTML, the title attribute is described as follows:

[The title attribute] makes it possible to annotate HTML elements with relevant comments or meta information. By default, the browser displays the commenting text in a small window ("tooltip") or in the status bar when the user moves the mouse over the display area of the HTML element.

With the help of this attribute, a backlink can be given additional information about the linked page beyond the anchor text. Although there is no official statement from Google that the text in this attribute affects the ranking, it would only be logical if it did.

Content relevance

The original idea of the Random Surfer model was the assumption that a user navigates from one web page to the next via hyperlinks, with chance determining which link is actually followed. The Reasonable Surfer model extends this with a probability component that weights certain links more heavily than others. The logical extension of this idea is to value backlinks more highly when they appear in a topically relevant context and, for example, provide the user with further information. That Google is able to establish relationships between different concepts is shown by various services such as the display of related search terms, as exemplified in the figure below. Even if this is probably based on a statistical model, Google at least has capabilities in this area.

Backlinks from a relevant context can also help detect manipulative (e.g. purchased) backlinks. If a large proportion of backlinks are topically irrelevant, for example, a flag may be set automatically, which either triggers a manual review or results in a ranking penalty for the page concerned, or even for the entire domain.

Link placement

The Reasonable Surfer model also implies that the placement of a backlink on a website (header, content, footer, etc.) has an impact on its weight.

Number of external and internal links

The number of internal and external links on a page reduces the value of each individual link. This follows directly from the PageRank algorithm. Google itself has named a guideline of at most 100 links per page. This figure is, however, historical and mainly stems from the fact that in its early days Google only indexed the first 100 KB of a web page.

It is generally believed that internal links do not carry as much weight as external ones in this context, since many websites display, for example, a navigation menu on every page, so that there are almost always some internal links on each page. For a linking page, however, it remains a positive criterion if it has as few outbound links as possible.
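A rough sketch of this effect (simplified; the real weighting used by Google is not public): if a page distributes its link value evenly, each outgoing link receives only a 1/N share of it.

```python
# Simplified illustration of the PageRank idea that every additional outgoing
# link on a page reduces the value passed on by each individual link.

def per_link_value(page_value: float, outbound_links: int) -> float:
    """Value passed on per link if a page splits its value evenly."""
    if outbound_links == 0:
        return 0.0
    return page_value / outbound_links

# A page with few outbound links passes on more value per link ...
print(per_link_value(1.0, 5))    # 0.2
# ... than a page that exhausts the historical guideline of 100 links.
print(per_link_value(1.0, 100))  # 0.01
```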


Trust Rank


Online SEO Tutorial | Articlezeneu

A concept to which a lot of importance is attached nowadays is TrustRank. Trust here means the trustworthiness of a page with respect to the information it provides and its resistance to webspam.
Search engines have a vested interest in delivering only results that are relevant and tailored to their users' searches. They are constantly confronted with aggressive online marketing methods, for example automated comment webspam.

Disambiguation

Before discussing the implementation of TrustRank, the term itself must first be clearly defined. When TrustRank is mentioned, what is generally meant is the algorithm presented in Combating Web Spam with TrustRank. One of the co-authors of this paper was Jan Pedersen, a Yahoo! employee who a year later filed the patent application "Link-based spam detection". TrustRank is therefore not a patent filed by Google. It is likely, however, that Google uses a principle very similar to TrustRank. Some confusion was caused by the fact that Google held a trademark on the term "TrustRank" at almost the same time, but used it to refer to an anti-phishing filter. See also the following video from Matt Cutts:



Algorithm

The basic idea of TrustRank is the division of websites into good and bad ones. Good websites are those whose content is regularly maintained and monitored; no distinction is made between the individual pages within such a domain. Good websites are also notable in that they link to bad sites only with very low probability, but to high-quality pages with high probability. Bad websites are spam pages that pursue illegal or fraudulent goals or exist solely for the purpose of search engine manipulation. TrustRank now provides a criterion by which good and bad websites can be identified:

  • Good website: high TrustRank
  • "Bad"/spammy website: low TrustRank

The problem at this point is that the distinction between good and bad sites cannot be made fully automatically. The TrustRank algorithm is therefore based on a so-called oracle function, in which a human makes this distinction. Since it is impossible, given the ever-growing number of websites, to assign every site an individual TrustRank value by hand, a PageRank-like inheritance principle is used. First, an automated pre-selection of pages is made that should contain as many good websites as possible; PageRank, for example, can serve as the selection criterion. The selected pages form the so-called seed set and are assigned a TrustScore by a human evaluator. This TrustScore is essentially the TrustRank value and is then, like PageRank, passed on to linked websites. Thanks to the property of good websites mentioned above, linking to bad sites only with very low probability but to good sites with high probability, TrustRank offers greater security against manipulation than PageRank. However, since it cannot be assumed that all linked pages are subject to the same content monitoring and care as the original seed, a damping factor is applied when the TrustRank is passed on. The following figure illustrates this:


The Concept of trust rank
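The following sketch illustrates the propagation idea described above; it is a simplified, hedged reading of the algorithm (essentially a PageRank computation biased towards a hand-picked seed set), and the link graph and seed set are invented for the example.

```python
# Hedged sketch of TrustRank propagation: trust starts at a manually selected
# seed set and is passed along outgoing links with a damping factor.

def trust_rank(links, seeds, damping=0.85, iterations=50):
    """links: dict page -> list of linked pages; seeds: set of trusted pages."""
    pages = set(links) | {t for targets in links.values() for t in targets}
    # Static seed distribution: all initial trust sits on the seed pages.
    d = {p: (1.0 / len(seeds) if p in seeds else 0.0) for p in pages}
    trust = dict(d)
    for _ in range(iterations):
        new_trust = {p: (1.0 - damping) * d[p] for p in pages}
        for page, targets in links.items():
            if not targets:
                continue
            share = damping * trust[page] / len(targets)
            for target in targets:
                new_trust[target] += share
        trust = new_trust
    return trust


web = {
    "trusted-portal": ["quality-blog", "news-site"],
    "quality-blog": ["news-site"],
    "news-site": [],
    "spam-site": ["spam-site-2"],   # never linked from the seed: stays at zero
    "spam-site-2": ["spam-site"],
}
for page, score in sorted(trust_rank(web, {"trusted-portal"}).items(),
                          key=lambda kv: -kv[1]):
    print(f"{page:15s} {score:.3f}")
```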

The Trust Rank - Conclusion

TrustRank is an effective approach to minimizing the proliferation of spam in search engine results. It can also be used as a ranking factor, since, just like PageRank, it is computed iteratively and can assign a value to every page on the Internet, which in turn creates a metric for comparing different websites. A high TrustRank can therefore lead to a better ranking; it can be obtained by being linked to from a page with high trust.

The difficulties in using the TrustRank algorithm lie, on the one hand, in choosing the right seed websites and, on the other, in calibrating the various parameters (such as the damping factor).

Finally, the most important facts about TrustRank in brief:

  • TrustRank is used to distinguish between good and bad (in the sense of spammy) sites
  • The algorithm for calculating TrustRank is based on a manual selection of trusted sites, from which trust is passed on
  • Analogous to PageRank, TrustRank also follows the principle of damped inheritance
  • TrustRank cannot be viewed publicly, but it is likely that it influences the ranking


Quantitative factors


Online SEO Tutorial | Articlezeneu

Quantitative factors are based on the unweighted number of backlinks pointing to a site. The original PageRank algorithm is the best example of this kind of factor. There are three key metrics in this area:


  • Link Popularity
  • Domain Popularity
  • IP Popularity

These metrics can be calculated both for an entire domain and for individual sub-pages of a domain. The figures for an entire domain result from the sum of the individual values of all its pages, including the home page. In the explanations below, the term "website" is used throughout for simplicity.


Link Popularity


The term link popularity refers to the total number of backlinks that point to a website. True to the motto "many links mean a high reputation", the rule here is: the larger this number, the better for the ranking of a website.


However, this figure is no longer considered particularly meaningful, as so-called site-wide links are nothing unusual nowadays. A concrete example is the blogroll, where a blogger links, for example, to a friend's blog or to other topically relevant blogs from the navigation of his own blog. The link then appears on every sub-page of his blog, and each of these links counts towards link popularity. The original idea that a link is a specific recommendation for further information is thus diminished, because a link placed in the blogroll obviously does not refer to the content of one specific sub-page.


Domain Popularity


Domain popularity is a measure of the number of distinct domains linking to a website. It does not matter how many times the site is linked from the pages of one domain. This significantly defuses the problem of site-wide links. For domain popularity, too, the rule is: the higher the figure, the more positive the impact on the ranking.


IP Popularity


An even stricter form of domain popularity is IP popularity. For this metric, all IP addresses from which links point to a website are counted. The metric arose from the fact that most web hosts have a certain, limited IP range available and use it for many different domains, so different domains may well be hosted on the same IP address.


To increase the domain popularity of a website, one might now get the idea of simply registering a number of different domains with a hoster and linking from these to the target site. This would obviously be a manipulation of the search engine algorithms, because the primary goal of the links placed would be to improve the ranking of the target website; they would not actually possess any value in terms of a genuine increase in reputation.


IP popularity exists in various forms. The version presented above is the least restrictive, since it merely requires that the IP addresses differ in any bit. In other variants, for example, a complete class C network (the first three octets of an IPv4 address) is taken as the basis. In every variant, however, the principle applies that a high IP popularity has a positive impact on the ranking.
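A hedged sketch of how the three metrics relate to each other (the backlink list is invented; real systems work on crawled link graphs): raw links, distinct linking domains, and distinct linking class C networks are simply counted.

```python
# Minimal sketch: link, domain and IP popularity computed from a list of
# backlinks, each given as (URL of the linking page, IP of the linking host).
from urllib.parse import urlparse

backlinks = [
    ("https://blog-a.example/post-1", "203.0.113.10"),
    ("https://blog-a.example/post-2", "203.0.113.10"),   # same domain, same IP
    ("https://news-b.example/review", "203.0.113.99"),   # same class C network
    ("https://forum-c.example/thread", "198.51.100.7"),
]

link_popularity = len(backlinks)
domain_popularity = len({urlparse(url).hostname for url, _ in backlinks})
# Class C variant of IP popularity: only the first three octets count.
ip_popularity = len({".".join(ip.split(".")[:3]) for _, ip in backlinks})

print("Link popularity:  ", link_popularity)    # 4
print("Domain popularity:", domain_popularity)  # 3
print("IP popularity:    ", ip_popularity)      # 2
```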


Conclusion


The quantitative factors of off-page optimization give a rough overview of the degree of linking, i.e. the popularity of a website on the Internet. The problem with these metrics is the danger of manipulation. Although manipulation attempts are made harder by the more restrictive variants, they are by no means rendered ineffective; there are, for example, already special hosting offers in which hundreds of domains are hosted on different IPs.


Another problem with these metrics is that they can lead to unjustified discrimination. The simplest example is two different domains that cover a common topic and frequently link to each other, for example because the other domain provides further information. In this case IP or domain popularity would be too restrictive, since it would be appropriate to value each of these links, even though they come from the same domain.


Because of these problems, one can assume that these metrics provide a first estimate of a website's off-page optimization, but are still not the measure of all things.


Syntactic distinction


Online SEO Tutorial | Articlezeneu

This chapter discusses the use of HTML syntax to optimize the text on a web page. Already in The Anatomy of a Large-Scale Hypertextual Web Search Engine, marked-up words were weighted separately and given greater importance, and Google has evolved continuously since then.


Validity

It is irrelevant to the ranking whether or not a website meets the W3C validation criteria.


HTML markup

HTML offers a wealth of options for marking up a text semantically. These include, for example, headings, lists, and bold or italic text passages. This markup induces a different weighting of the keywords concerned.


Multimedia content

Images, videos and music, for example, fall into the category of multimedia content. Dedicated searches exist for some of these content types. Besides optimizing for those searches, there is also the theory that the use of images has a positive impact on the organic search results.


Multimedia content


Online SEO Tutorial | Articlezeneu

Search engines generally have problems understanding multimedia content such as videos, photos or music files. They rely partly on the surrounding text and partly on the use of the alt attribute. This attribute describes the content to be displayed and is needed, for example, by screen readers.

Furthermore, the filename of the image contributes to information retrieval. Optimizing these factors definitely influences the position of a graphic in Google image search and, according to Mario Fischer (Website Boosting 2.0, p. 325), also contributes to a better ranking of the page that displays the image (provided it fits thematically).
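A small sketch of what this means in markup terms (the filename and texts are invented examples): a descriptive filename combined with a meaningful alt attribute.

```python
# Hypothetical example: build an <img> tag with a descriptive filename and a
# meaningful alt attribute instead of a generic name like "IMG_0001.jpg".

def image_tag(filename: str, alt_text: str) -> str:
    return f'<img src="/images/{filename}" alt="{alt_text}">'

print(image_tag("red-leather-office-chair.jpg",
                "Red leather office chair with armrests"))
```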


HTML markup


Online SEO Tutorial | Articlezeneu

HTML is a markup language for textual content. The markup is applied through tags that express a certain semantic meaning and are rendered accordingly by the browser, so that this meaning also becomes visible to a visitor. Through this markup it is thus possible to assign semantics to certain content.

There is no official statement on how much weight is attached to which tag, and one can assume that there are no simple mathematical relationships, but rather that a positive ranking value is made up of several factors. It is not enough, for example, to highlight all the keywords in the text in bold or even to mark them up as headings; such techniques tend to be considered spam. The following are some examples of tags that have an impact on the ranking.

Headlines

Headings are marked up in HTML with <Hx> tags, where x is the level of the heading; <h1> tags identify the main headings. The keyword or keyword combination should in any case appear within an <Hx> tag. However, no clear statement can be made as to whether Google uses the different levels of Hx tags in their natural order as ranking factors. Arguing against this assumption is, for example, the fact that the world's most popular blogging system, WordPress, by default places the titles of new posts in <h2> rather than <h1> tags.

Some time ago it was also fashionable to use <h1> tags as a replacement for <b> or <strong> tags in order to benefit from the "heading bonus". From Google's point of view this is of course not in the user's interest and tends to be considered spam. Nevertheless, it is fine to use more than one <h1> tag, as Matt Cutts explains in the following screencast:






However, it is relatively safe to say that words in headings carry a higher importance because they clearly stand out from the text. Eye-tracking analyses have also shown that visitors mainly read the headlines and only skim the rest of the text. So it makes perfect sense to ascribe greater importance to these structuring elements.

Lists

Lists are also structuring elements and catch the eye when viewing a web page. In Document ranking based on semantic distance between terms in a document, various approaches and algorithms are presented that are intended to determine the semantic proximity of words in a text. One of these approaches holds that lists must be treated separately, since they may, for example, represent a set of equally weighted bullet points. The items shown in the diagram below, for instance, should be considered equivalent with respect to their semantic distance from the heading, even though the last item is, purely in terms of position, further away from it than the first.


Example of semantic proximity of list items



Bold, italic, etc.

Words that are marked up within continuous text in a way that sets them apart from the rest catch a reading visitor's eye. These words therefore appear to carry more significance than the unmarked words. At this point, however, the term "markup" is used rather broadly, because since the introduction of CSS it has been possible to create these visual effects without using the HTML tags provided for the purpose. A search engine cannot recognize such purely visual highlighting.

According to Mario Fischer, at least bold markup via <b> or <strong> tags has a detectable influence (Website Boosting 2.0, p. 320). It is likely that the same applies to italics (<i> or <em> tags) and underlined text (<u> tag). Incidentally, <b> and <strong> as well as <i> and <em> tags are treated as equivalent, as Matt Cutts confirmed in the following screencast (from 0:48):

From a usability perspective, it makes sense to highlight the most important statements in a text in particular, so that a reader only needs to skim the text to grasp the core message.

Conclusion

How can this be exploited in practice? The above examples and explanations are intended only as an indication that Google evaluates the HTML syntax of a text. For practical purposes, it therefore makes sense to mark up keywords and semantically related terms accordingly.
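To make the idea concrete, here is a hedged sketch; the weights are invented and are not Google's actual values. A scorer could simply give terms that appear inside markup tags a higher weight:

```python
# Illustrative sketch only: count term occurrences in an HTML snippet and give
# words inside markup tags (headings, bold, emphasis) a higher, made-up weight.
from collections import Counter
from html.parser import HTMLParser

WEIGHTS = {"h1": 3.0, "h2": 2.0, "strong": 1.5, "b": 1.5, "em": 1.2, "i": 1.2}


class WeightedTermCounter(HTMLParser):
    def __init__(self):
        super().__init__()
        self.stack = []          # currently open tags
        self.scores = Counter()  # term -> accumulated weight

    def handle_starttag(self, tag, attrs):
        self.stack.append(tag)

    def handle_endtag(self, tag):
        if tag in self.stack:
            # drop the most recently opened occurrence of this tag
            idx = len(self.stack) - 1 - self.stack[::-1].index(tag)
            del self.stack[idx]

    def handle_data(self, data):
        weight = max([WEIGHTS.get(t, 1.0) for t in self.stack] or [1.0])
        for word in data.lower().split():
            self.scores[word] += weight


counter = WeightedTermCounter()
counter.feed("<h1>Backlink basics</h1><p>A <strong>backlink</strong> is a link.</p>")
print(counter.scores.most_common(3))
```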


Validity


Online SEO Tutorial | Articlezeneu

Contrary to some rumors, the validity of web pages according to the W3C is not a criterion used in Google's ranking. Matt Cutts confirmed this in a corresponding screencast.



A majority of the sites on the net do not contain valid source code, whether to support old browsers or because no importance is attached to validity in everyday practice. In general, however, the validity of the source code says nothing about the quality of the content, and most current browsers render even invalid code correctly. Taking validity into account would therefore potentially lead to worse results, because websites with invalid code but good content would be disadvantaged.


Content - The content of a website


Online SEO Tutorial | Articlezeneu

The content is the most interesting part of a web page for search engines, because it potentially contains the most information. One of the most common recommendations for search engine optimization (and one of Matt Cutts' principles) reads as follows: "Produce good quality content that is made for people, and success in the search engines will follow by itself."


This statement may have been true in this form in the early days of the Internet, but with the development of Web 2.0 it is no longer tenable. This is due firstly to the fact that there is hardly a topic on which no information exists on the Internet, and secondly to the fact that the quality of the content alone does not determine the ranking. It is the recommendation in the form of hyperlinks that makes the content of a web page high quality in Google's eyes. In addition, there are other factors that must be considered when creating content, because they influence the ranking.


A quote from Matt Cutts in "How does Google collect and rank results?" provides an interesting approach that serves as a general guide for preparing search-engine-optimized content:


Pretend you are a search engine. Pick a query like civil war, or whatever you like. Search for the phrase on Google, pick three or four pages from the results, and print them out. On each printout, find the individual words from your query (like "civil" and "war") and use a highlighter to mark each word with color. Do that for each of the documents you printed out. Now tape those documents on a wall, step back a few feet, and squint your eyes. If you didn't know what the rest of a page said, and could only judge by the colored words, which document do you think would be most relevant? Is there anything that would make a document look more relevant to you? Is it better to have the words in a large font or a heading, or to occur several times in a smaller font? Do you prefer it if the words are at the top or the bottom of the page? How often do the words need to appear? See if you can come up with 2-3 things you would look for to see if a document matched a query well. This will help students learn to evaluate website relevance the way a search engine would evaluate it, so that they can better understand why a search engine returns certain results over others.


Keyword Density

Keyword density is a measure of the "density" of keywords within a web page. This figure should, however, be treated with caution.


Keyword Proximity and Keyword Positioning

Even in Google's early days, the proximity of the query terms to one another ("proximity") was factored into the ranking. Positioning refers to where a keyword is mentioned within the different semantic areas of a web page.


Internal and external links

Linking to internal and external pages is important for two reasons. First, internal linking structures can be optimized to ensure an optimal PageRank flow; second, relevant outbound links are also factored into the ranking (keywords: good and bad neighborhood).


Semantic Relevance

Recognizing semantics is the supreme discipline of information retrieval and the goal of every search engine. When a search engine knows the semantics (i.e. the "meaning") of a text, it can answer a user's search queries in a more targeted way.


Semantic Relevance


Online SEO Tutorial | Articlezeneu

A major problem in assessing documents is recognizing the topic a document describes. This becomes evident if one takes, for example, words with multiple meanings (homonyms, in German "Teekesselchen"). Without additional information, a search engine cannot tell which meaning of such a word is intended on a page.

Phrase Matching

To work around this problem, Google probably uses algorithms that calculate the semantic proximity of words. A clue can be found, for example, in Automatic taxonomy generation in search results using phrases. There, a method is presented in which the groups of words that occur together with a keyword or keyword combination are examined. If this method is applied to a sufficient number of documents, the documents can be grouped into clusters on the basis of these word groups, which allows the content to be categorized.

LDA - Latent Dirichlet Allocation

A similar approach is known as Latent Dirichlet Allocation (LDA). Such systems are entirely feasible today, as is demonstrated, for example, in the online course on information retrieval and text mining.

LSI - Latent Semantic Indexing

One also often reads of so-called latent semantic indexing (LSI), which amounts to a similar result.
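As a rough, hedged sketch of what such topic modelling looks like in practice (scikit-learn is used here as an assumption of this example, it is not mentioned in the text), a handful of toy documents can be decomposed into latent topics:

```python
# Hedged sketch: group a few invented toy documents into latent topics with
# LDA. Requires scikit-learn.
from sklearn.decomposition import LatentDirichletAllocation
from sklearn.feature_extraction.text import CountVectorizer

docs = [
    "jaguar speed engine car road test drive",
    "car engine tuning fuel road",
    "jaguar jungle cat prey habitat",
    "cat prey hunting jungle animal",
]

vectorizer = CountVectorizer()
X = vectorizer.fit_transform(docs)

lda = LatentDirichletAllocation(n_components=2, random_state=0)
doc_topics = lda.fit_transform(X)   # per-document topic distribution

terms = vectorizer.get_feature_names_out()
for i, topic in enumerate(lda.components_):
    top = [terms[j] for j in topic.argsort()[-4:][::-1]]
    print(f"Topic {i}: {', '.join(top)}")

# Each document receives a topic mixture; an ambiguous word like "jaguar"
# can contribute to both the "car" topic and the "animal" topic.
print(doc_topics.round(2))
```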

Conclusion

In conclusion, it should be noted that various methods already exist by which the semantic content of documents can be captured in some way, and Google may well use this data for the ranking. SeoBook.com provides some practical examples of how an SEO can take advantage of this knowledge. For a search-engine-optimized text, this generally means that not only the mention of the keyword matters for the ranking, but also that the context of the word should be illustrated by semantically similar terms, making the text appear more relevant to a search engine.


Internal and external links


Online SEO Tutorial | Articlezeneu

When building a website on a specific topic, it may well happen that overlapping sub-topics are covered on different pages. In this context you should definitely make use of internal linking within your website. On the one hand this helps the human user, who can read up on a related topic, and on the other hand it has a positive effect on the search engine ranking (for both pages involved).

The page being linked to benefits because link power flows to it in this way; the positive effect is thus directly related to the well-known PageRank principle. The maximum effect is achieved if the keyword for which the linked page is supposed to rank is chosen as the link text. But the page on which the link is placed can also have its ranking positively influenced, because Google evaluates the outgoing links of a website as well. This is generally referred to as good neighborhood and bad neighborhood.

Good Neighborhood

"Good neighborhood" refers to linked websites that are characterized by positive properties. What counts as "positive" can be assessed in two ways. First, you can judge as a person whether a website contains high-quality information and presents it in a user-friendly way. Second, you can rely on the search engines' own assessment, which is reflected in the rankings: if you search Google for more information on the topic you are currently writing about, the SERPs offer a good indication of what Google considers a "good neighbor".

Bad Neighborhood

A website can be judged negatively if Google's Webmaster Guidelines are violated on it, for example through the intentional hiding of text or the use of spam techniques. The same applies to infected websites. These so-called bad sites are themselves demoted in the ranking and at the same time have a negative effect on the ranking of sites that link to them.

Conclusion

Google claims always to put the user at the center. It generally makes for a good user experience when a web page links to other relevant information, whereas off-topic links to third-party content are either simply ignored or, in the worst case, even have a negative effect.

Relevant internal and external links should therefore be used, but with external links one should proceed with caution with regard to bad neighborhoods.


Keyword Proximity and Keyword Positioning


Online SEO Tutorial | Articlezeneu

For keywords that consist of more than one term, the proximity of those terms to each other (keyword proximity) and their order play a role in the evaluation: the closer together the words are, the higher the relevance. Keyword positioning, in turn, refers not to whether the keyword appears near the top or the bottom of a page's source code, but to whether it is located in a prominent place (for example in the content) or a less prominent one (for example in the footer).
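A hedged sketch of the proximity idea (the sentences are invented; real scoring functions are not public): the smaller the distance between the parts of a multi-word keyword, the stronger the phrase match.

```python
# Minimal sketch: smallest distance (in words) between the two parts of a
# two-word keyword within a text.
from typing import Optional


def keyword_proximity(text: str, first: str, second: str) -> Optional[int]:
    words = [w.strip(".,;:!?").lower() for w in text.split()]
    pos_first = [i for i, w in enumerate(words) if w == first.lower()]
    pos_second = [i for i, w in enumerate(words) if w == second.lower()]
    if not pos_first or not pos_second:
        return None
    return min(abs(i - j) for i in pos_first for j in pos_second)


print(keyword_proximity("Cheap car insurance for young drivers",
                        "car", "insurance"))   # 1 -> directly adjacent
print(keyword_proximity("Insurance is expensive, even for a small car",
                        "car", "insurance"))   # 7 -> much weaker phrase match
```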


Keyword Density


Online SEO Tutorial | Articlezeneu

In the ideal case, a web page deals with exactly one keyword or keyword combination and is optimized for exactly that keyword. The frequency with which this keyword appears in the content of a web page is one of Google's ranking factors. As a concrete value, the so-called keyword density is used: the ratio of the number of occurrences of the keyword to the number of all words on the page. There is no officially confirmed Google value for a good keyword density, but the book Website Boosting 2.0 by Mario Fischer (p. 311), for example, recommends values between three and four percent. The calculation of keyword density is itself not without problems, however, because it is not known whether the complete text of a web page (including boilerplate) is used or how stop words are handled. Further information on this issue is provided by Karl Kratz in his piece on the SEO keyword density myth.

As a goal, however, you should definitely aim for the targeted keyword to occur most frequently of all the words (except stop words), and thus to have the highest keyword density, in order to prevent search engines from misjudging the topic of the page. It should be noted, though, that the keyword density must not become too high, as this could indicate an attempt at manipulation. This is known as keyword stuffing and violates the Google Webmaster Guidelines.

Keyword density goes back to one of the most important factors from the early days of information retrieval, the term frequency, i.e. the number of times a term occurs. If you normalize the term frequency to a value between 0 and 1, you divide it by the number of all words and thus arrive again at the definition of keyword density. The measure also exists in the form of TF-IDF, where IDF stands for "inverse document frequency" and effectively reduces the general "value" of a word if it occurs in many documents. Example: the word "not" occurs extremely often and is therefore not well suited to classifying a document, even if it appears frequently on a particular website.
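A hedged sketch of the two measures just described (the mini document collection is invented; real implementations differ in tokenization and stop-word handling):

```python
# Keyword density and a simple TF-IDF variant for a small, invented
# document collection.
import math
import re


def words(text):
    return re.findall(r"[a-z]+", text.lower())


def keyword_density(text, keyword):
    """Occurrences of the keyword divided by the number of all words."""
    tokens = words(text)
    return tokens.count(keyword.lower()) / len(tokens) if tokens else 0.0


def tf_idf(term, document, collection):
    """Normalized term frequency weighted by inverse document frequency."""
    tf = keyword_density(document, term)
    containing = sum(1 for doc in collection if term.lower() in words(doc))
    idf = math.log(len(collection) / containing) if containing else 0.0
    return tf * idf


docs = [
    "Cheap LED lamps for the living room",
    "LED lamps and LED spotlights for the garden",
    "Modern LED lighting saves energy",
]
print(round(keyword_density(docs[1], "led"), 2))   # 0.25 (2 of 8 words)
print(round(tf_idf("led", docs[1], docs), 3))      # 0.0: "led" occurs in every document
print(round(tf_idf("garden", docs[1], docs), 3))   # higher: "garden" is specific to this page
```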

Because of this, it is likely that keyword density is one of the more important onpage factors. This is supported, among other things, by a series of tests by Sistrix in which the relationship between keyword density and ranking was investigated.

Keyword Density - Summary

The keyword density is the frequency of a word relative to all other words on a web page. The exact calculation Google uses is not known, mainly because special cases such as plurals, compound words, stop words, etc. exist whose treatment can only be determined experimentally. What is fairly certain, however, is that the keyword density has an influence on the ranking.

Finally, once the most important facts about keyword density in brief:

  • Keyword density = number of keyword occurrences / number of all words
  • The keyword density has an influence on the ranking
  • An unnaturally high keyword density may result in a penalty; as a guideline, 2 to 4% is a good rule of thumb


Meta Information


Online SEO Tutorial | Articlezeneu

Search engines do not just assess the actual content of a website but also meta information about a web page. With this additional information, they can better assess the topic of a page.

Domain Age

From Google's point of view, the age of a domain corresponds to the date of its first crawl. Younger domains must first "establish" themselves and are treated "skeptically" up to a certain age.

Domain Name

The domain name is a keyword supplier for calculating the relevance of a web page. Here, in particular, the concepts of exact match domain and keyword domain play a role.

Title

The title is displayed at the top of the browser window. Moreover, it is usually the link text in Google's search results that appears above the snippet. The title is a very important ranking factor; it should be defined specifically for each page and be unique.

URL

The URL uniquely identifies a web page and is entered in the address bar of the browser. A good URL describes the content of the page it identifies, ideally using that page's main keywords.

Duplicate Content

Duplicate content (often just DC) designates web pages that provide similar or identical content. Since this offers no added value to a user, Google tries to keep such pages out of the index as far as possible or to select one representative page.

Meta Tags

Meta tags are defined in the header of an HTML page and represent typical meta information of HTML pages. Google currently does not use any of these meta tags for the ranking itself, but nevertheless evaluates a few of them for other functions.

Posted in , , , | Leave a comment

Structure and building a home page


Online SEO Tutorial | Articlezeneu

This section addresses the basic structure of a website. The measures described should be carried out as early as possible, ideally before the actual content of the domain is created. Since the foundation of the site is laid here, any errors may be difficult to correct in hindsight.

Semantic structure of HTML pages

HTML pages typically have a structure with areas such as header, content and footer. These areas contribute to very different degrees to the actual information content of a single page, and this is presumably taken into account by search engines.

robots.txt

The file robots.txt is stored in the root directory of a domain and provides instructions for web crawlers regarding the permission or prohibition to crawl different sub-pages and directories.
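As an illustration, here is a minimal sketch of what such a file might look like and how a crawler can interpret it. The domain, paths and rules are invented, and Python's standard urllib.robotparser module is used only as an example client.

```python
# Minimal sketch: an invented robots.txt and how a crawler can evaluate it
# with the standard library's robots.txt parser.
from urllib.robotparser import RobotFileParser

robots_txt = """\
User-agent: *
Disallow: /intern/
Disallow: /tmp/

User-agent: Googlebot
Allow: /

Sitemap: https://www.example.org/sitemap.xml
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# All crawlers are blocked from /intern/, except Googlebot, which is allowed everything:
print(parser.can_fetch("*", "https://www.example.org/intern/seite.html"))         # False
print(parser.can_fetch("Googlebot", "https://www.example.org/intern/seite.html")) # True
```

Note that robots.txt only expresses crawling rules; it is a request to well-behaved crawlers, not an access control mechanism.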

Sitemap

A sitemap is generally a structured overview of the contents of a domain. In addition, there is a special Google sitemap format that must follow a specific structure. Through this sitemap, the Googlebot can be supported during crawling.
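The following is a minimal sketch of how a sitemap in the common sitemaps.org XML format could be generated with the Python standard library; the URLs and dates are placeholders.

```python
# Minimal sketch: generating a sitemap in the sitemaps.org XML format.
import xml.etree.ElementTree as ET

NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def build_sitemap(entries):
    ET.register_namespace("", NS)           # use the sitemap namespace as default
    urlset = ET.Element("{%s}urlset" % NS)
    for loc, lastmod in entries:
        url = ET.SubElement(urlset, "{%s}url" % NS)
        ET.SubElement(url, "{%s}loc" % NS).text = loc
        ET.SubElement(url, "{%s}lastmod" % NS).text = lastmod
    body = ET.tostring(urlset, encoding="unicode")
    return '<?xml version="1.0" encoding="UTF-8"?>\n' + body

print(build_sitemap([
    ("https://www.example.org/", "2013-06-01"),
    ("https://www.example.org/lighting-planning", "2013-06-15"),
]))
```

The finished file is usually uploaded to the root directory of the domain and can additionally be referenced from robots.txt or submitted via Google's Webmaster Tools.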

Page Rank Sculpting

PageRank sculpting deals with the optimization of the internal PageRank flow and generally has the goal of minimizing the PageRank passed on to unimportant pages.

User Experience

Besides the content, Google draws on other signals for the ranking. These include, for example, the usability of a page or the general impression the page leaves on a user.

Posted in , , | Leave a comment

On page optimization


Online SEO Tutorial | Articlezeneu

The term onpage optimization summarizes all measures that can be carried out on a web page itself in order to be listed better in search engines for a given term. Besides the pure mention of the search term, a number of other factors lie behind onpage optimization, which are described below.

The structure and form of HTML pages


The structure of a website should be thought through before its creation. Which content is important? Which subpages result from this? How can these pages be categorized in a useful way? The answers to these questions are essential for onpage optimization. Based on this, it must further be decided which pages are linked to each other and how this can be communicated to Google.

Meta Information


Meta information provides additional details beyond the actual statement of a website. This meta information is varied and offers large potential for onpage optimization, because it gives search engines a clear indication of the treated subject.

Content - The content of a website


The content of a website is what search engines are after. It provides the actual information and is the most interesting part for visitors. Using various methods of information retrieval, search engines try to categorize and weight the contents of a website.

Syntactic distinction


The vast majority of today's web pages are based on HTML. This markup language provides many ways to structure a text and, in particular, to set certain parts apart. For search engines, this onpage factor is also a help in identifying the relevant concepts within a page.

On page optimization - Summary


Onpage optimization deals with the elements (text, markup, internal linking, etc.) on a web page itself. It thus represents the counterpart to offpage optimization. In general, the influence of onpage optimization is subordinate to that of offpage optimization. However, it must not be forgotten that the source code of a web page is the first point of contact with the crawler of a search engine. Many websites give away potential in onpage optimization because the targeted keywords are not focused strongly enough.

Even without understanding the actual meaning of the text, one can clearly see that it provides information on "onpage optimization".

Finally, the most important facts about onpage optimization in a nutshell:

  • Onpage optimization describes the exploitation of optimization potential directly on a web page
  • Through onpage optimization, a search engine can be told the topic of a website
  • Onpage optimization should go hand in hand with usability; in particular, web pages should above all be readable and usable for users

Posted in , , , | Leave a comment

SEO - Basics


Online SEO Tutorial | Articlezeneu

Larry Page, one of Google's founders, described the perfect search engine as something that understands exactly what you are looking for and delivers exactly that as a result. To meet this demand, Google has developed a technology based on the following three components: crawling, indexing and query processing.

The finishing touch is provided by the introduction of a ranking, so that the relevant documents are presented to the user in a convenient order. The foundations for this are summarized below under the original ranking criteria.

Crawl

Google uses so-called web crawlers (also often called crawlers or spiders) to find websites. The Google crawler is called Googlebot. In general, web pages are not accessed randomly; instead, the crawler works its way systematically through the links of the web. The hyperlinks of a retrieved web page are extracted and stored in a queue, which is then processed step by step. To conserve resources, however, it is first checked which web pages the crawler has already retrieved.
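Here is a heavily simplified sketch of this idea in Python: a queue of URLs to visit and a set of pages that have already been fetched, so that no resource is retrieved twice. The link extraction is reduced to a hard-coded, invented link graph; a real crawler like the Googlebot is of course far more complex.

```python
# Minimal sketch of a crawler: URL queue plus a set of already visited pages.
from collections import deque

def extract_links(url):
    # Placeholder: a real crawler would download the page and parse its
    # hyperlinks. Here a tiny invented link graph stands in for the web.
    link_graph = {
        "https://example.org/":  ["https://example.org/a", "https://example.org/b"],
        "https://example.org/a": ["https://example.org/b"],
        "https://example.org/b": ["https://example.org/"],
    }
    return link_graph.get(url, [])

def crawl(start_url):
    queue = deque([start_url])
    visited = set()
    while queue:
        url = queue.popleft()
        if url in visited:
            continue  # already fetched, conserve resources
        visited.add(url)
        for link in extract_links(url):
            if link not in visited:
                queue.append(link)
    return visited

print(sorted(crawl("https://example.org/")))
```

The queue corresponds to the list of extracted hyperlinks described above, and the visited set is what prevents pages from being fetched more than once.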

At present, two different crawling processes can be distinguished: deep crawling and fresh crawling. Deep crawling is the process explained above, while fresh crawling is responsible for the timeliness of the retrieved pages; here, already well-known pages are re-crawled in order to pick up the latest changes.

The results of the crawling are passed to the so-called indexer, which is explained below.

Indexing

Merely collecting web pages initially offers nothing more than an archive of information. The main purpose of search engines, however, is searching (and finding) documents. Since the duration of this process grows with an increasing number of documents, a technique must be found to make it as efficient as possible. For this reason, Google builds an index from each crawled web page, consisting of the individual words of the document. The index associates words with documents and can be searched in parallel by distributed servers. This kind of structure is also referred to as an "inverted index".

The index itself is optimized for search requests (for example, words are stored only in lowercase and in alphabetical order). The efficient application of this method allows Google to answer queries in a fraction of a second, although in theory several billion web pages would have to be searched.
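A minimal sketch of such an inverted index in Python follows; the example documents are invented and the tokenization is deliberately crude. For every lower-cased word, the IDs of the documents in which it occurs are stored, so a query only has to look up words instead of scanning every document.

```python
# Minimal sketch of an inverted index: word -> set of document IDs.
import re

def build_inverted_index(documents):
    index = {}
    for doc_id, text in enumerate(documents):
        for word in set(re.findall(r"[a-z]+", text.lower())):
            index.setdefault(word, set()).add(doc_id)
    return index

docs = [
    "Proper lighting planning for your home",
    "Search engines index the content of web pages",
    "Planning the lighting of a home before construction",
]
index = build_inverted_index(docs)
print(sorted(index["lighting"]))  # -> [0, 2]
print(sorted(index["planning"]))  # -> [0, 2]
```

Looking up a word is now a single dictionary access, regardless of how many documents the collection contains, which is exactly what makes the approach scale.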

Query Processing

Query processing provides the interface to the users of the Google search engine: an entered search query is prepared by Google and sent to the database. This preparation includes, for example, the removal of stop words.

The request to the index database then delivers all documents that contain the search terms. This document set is also referred to as a "posting list". The real achievement lies in sorting this posting list so that the most relevant results appear at the beginning. For this, Google uses more than 200 ranking factors which evaluate, on the one hand, the relevance and, on the other, the reputation of a page. The result is what is generally summarized under the term SERP and what is finally displayed to the searching user.
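A minimal sketch of this step, assuming an inverted index like the one from the previous example: stop words are removed from the query and the posting lists of the remaining terms are intersected to find the documents containing all of them. The stop-word list and the hand-built index are invented, and the subsequent ranking of the posting list is omitted entirely.

```python
# Minimal sketch of query processing: stop-word removal and posting-list
# intersection over an invented inverted index.
STOP_WORDS = {"the", "of", "a", "for", "your"}

def process_query(query, index):
    terms = [t for t in query.lower().split() if t not in STOP_WORDS]
    posting_lists = [index.get(t, set()) for t in terms]
    if not posting_lists:
        return set()
    return set.intersection(*posting_lists)

# A tiny hand-built inverted index (word -> set of document IDs):
index = {
    "lighting": {0, 2},
    "planning": {0, 2},
    "home":     {0, 2},
    "search":   {1},
    "engines":  {1},
}
print(process_query("lighting for the home", index))  # -> {0, 2}
```

In a real search engine, this intersected posting list would then be sorted by the ranking factors mentioned above before being rendered as the SERP.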

Original ranking criteria

Some of Google's ranking factors are described in The Anatomy of a Large-Scale Hypertextual Web Search Engine. They are explained below, together with an assessment of their current relevance. The factors are PageRank, anchor text and a number of "other features".

PageRank and anchor text fall within the field of offpage optimization, while the "other features" relate to onpage optimization.

Other features

Under "Other Features" are mentioned in the original version of Google 1998 Factors "keyword proximity" and "HTML markup". Under keyword proxmity is understood to be close to each other, the search terms within a document. The index is compared to the source code of the first search term with the other terms. HTML markup refers to the syntactic markup such as font size and color.

Posted in , , | Leave a comment

Meta Tags


Online SEO Tutorial | Articlezeneu

Meta tags are defined in the head section of an HTML page and provide extra information about the page. For Google, only a few of the possible meta tags have any meaning, as described in Google's overview of the meta tags it can read.


Meta Description


The description meta tag is intended as a brief summary of the contents of a page and should be unique for each page of a domain. According to the Google Webmaster Central blog post "Google does not use the keywords meta tag in web ranking", this tag does not affect the ranking of a website:


[...] even though we sometimes use the description meta tag for the snippets we show, we still don't use the description meta tag in our ranking. [...]


However, the description may be used for generating snippets. There is no guarantee of this, though, because Google adapts the snippet to the search query.


Meta Keywords


The keywords meta tag does not affect the ranking of a website in the SERPs and otherwise has no significance for Google. It is mentioned here only because it is a common myth that this tag can affect the ranking of a page. There are several screencasts by Matt Cutts that refute this.




Robots


The robots meta tag controls the behavior of search engines on the page on which it is placed. It can take the following values:


  • noindex -> prevents the page from being indexed
  • nofollow -> prevents the page's hyperlinks from being used for further crawling
  • nosnippet -> prevents the display of snippets
  • noodp -> prevents the description text of the ODP (Open Directory Project), if present, from being used as a snippet
  • noarchive -> prevents Google from keeping a cached version of the page
  • unavailable_after: [date] -> prevents crawling after the date specified by [date]
  • noimageindex -> prevents the pictures on this page from showing up in Google Image Search


None of these values has an influence on the ranking. If the tag is not set, the default value "index, follow" is assumed. This indicates that the page should be indexed and that the linked pages may be crawled.
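As an illustration, here is a minimal sketch of how a crawler could read this tag, using Python's standard html.parser module. The HTML snippet is invented, and the fallback to "index, follow" simply follows the default described above.

```python
# Minimal sketch: extracting the robots meta tag from an HTML page and
# deriving the crawler directives from it.
from html.parser import HTMLParser

class RobotsMetaParser(HTMLParser):
    def __init__(self):
        super().__init__()
        self.directives = None

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "meta" and (attrs.get("name") or "").lower() == "robots":
            content = attrs.get("content") or ""
            self.directives = [d.strip().lower() for d in content.split(",")]

html = '<html><head><meta name="robots" content="noindex, nofollow"></head><body></body></html>'
parser = RobotsMetaParser()
parser.feed(html)
directives = parser.directives or ["index", "follow"]  # default if the tag is missing
print("indexing allowed:", "noindex" not in directives)        # -> False
print("links may be followed:", "nofollow" not in directives)  # -> False
```

Unlike robots.txt, which controls whether a page is fetched at all, the robots meta tag is only seen after the page has been crawled and then controls indexing and link following.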


Refresh


The refresh meta tag causes a redirect that Google treats like a 301 redirect. The tag has, however, been deprecated by the W3C and should no longer be used.


Dublin Core standard


The Dublin Core standard is another method for providing an HTML document with meta information. Through the standardization of certain values, it has some advantages over conventional meta tags. With regard to search engine optimization, however, using metadata according to the Dublin Core standard brings no advantages. This was investigated, for example, in 2005 by www.webology.ir.

Posted in , , , | Leave a comment