Search Engine Optimisation: The Soon-to-Be Impossible Dream!
Today there is a whole market of search engine and internet marketing services; in fact, a new industry has materialised to exploit the fear of low search rankings. This is not a new trend: back when simply resubmitting your website to the engines was enough to keep your site at the top of the index, there was an accompanying boom in resubmission "companies". As we now know, these were just men in back bedrooms with a collection of CGI and Perl submission scripts and a timetable. Search engine optimisation, or "SEO", is the latest incarnation of this bedroom profiteering. The important difference is that webmasters are no longer just passively involved; they are being pushed to adopt totally artificial and antisocial practices that ultimately serve only to damage the Internet!

SEO is supposedly the methodology and set of processes involved in designing search engine "friendly" web content. The basic premise is something like: "If I follow all the engines' formatting and connectivity criteria, then my website will rank higher than a comparable website that does not." All other things being equal, this seems quite positive, given that the quality of a search engine's database (index) directly affects its output; if webmasters optimise their content so that search engines can correctly categorise the Internet, that should logically improve the speed and quality of "the crawl". SEO, then, should be good for the search providers: an efficient index should use less raw processing power, require less equipment, and thus less energy. It should also be good for the users, who would be able to quickly and intuitively find what they want from a reliable source. Sounds reasonable, right? Well, that's the happy version.
The fact is that while this may initially be true, and you may gain a short-term advantage, once we have all optimised our content for analysis and (in so doing) ignored our users, we will be back where we started. The search providers will then think up some even more ridiculous "laws" by which to "judge" us, and like sheep we will all follow those as well; thus the causal paradox is perpetuated and the users feel abused! Even this is a vast oversimplification; the true nature of SEO is far more complicated. The heart of the problem, and the real issue here, is the search providers' task: to strip-mine the information junkyard otherwise known as the Internet. It may be full of interesting material, but there is also plenty of garbage, and they need to devise intelligent techniques to mine the interesting parts. The current "solution" is literally for the search engines to use their hegemonic standing to bully webmasters into organising their work in ways whose primary effect is to allow quick "analysis" so the website can be categorised. This has the secondary effect of requiring content to be designed "for" analysis, which typically translates to highly distributed connectivity, i.e. the website being effectively divided into "micro sites", which makes the maintenance of links and content more troublesome! This is not necessarily a bad thing; most of these imposed linking and design methodologies are positive and beneficial for many subjects. My problem is that they are unilaterally enforced, and it is this type of issue that is generating all the money for the SEO boys. However, this will soon be of no consequence. To understand the problem with this type of SEO operation, it is necessary to think about how we can approximate and simulate the human process of mining information and knowledge.
Let us assume we have set our crawlers to work, automatically indexing pages (at random, guided by previous indexing and by user requests). We then format the resulting text: ASCII is usually used, and validation follows; search engines tend to ignore some tags and make use of helpful ones that identify the content. At this point we would have reduced the Internet to a corpus, i.e. the collection of all HTML documents about no particular subject. We would then set about item normalisation: identification of tokens (words), characterisation of tokens (tagging words with meaning), and finally running stemming algorithms to remove suffixes (and/or prefixes) to derive the final database of terms. This can be efficiently and compactly represented in lower-dimensional term spaces (Google is still essentially using inverted file structures). Imagine each document of the corpus as a point, i.e. a vector in an N-dimensional space. Here the literal word-matching type of search is lost, but we acquire more of a semantic flavour, where closely related information can be grouped into clusters of documents bearing similarities; however, N-dimensional vector spaces are of no direct help to the users. After applying our algorithms to the corpus, we get a term-by-document matrix, where terms and documents are represented by vectors; a query can also be represented by a vector. So we have a query and our corpus, represented as vectors of the same dimension, and we can now match the query against all the available documents using the cosine of the angle between the two vectors.
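The steps above (item normalisation, stemming, the term-by-document matrix, and cosine matching) can be sketched in a few lines of Python. This is only a toy illustration: the crude `stem` function and the three-document corpus are invented here, and real engines use far more sophisticated pipelines.

```python
import math
from collections import Counter

def stem(word):
    # Very crude suffix stripping, standing in for a real stemming algorithm
    if word.endswith("es"):
        return word[:-2]
    if word.endswith("s"):
        return word[:-1]
    return word

def tokenise(text):
    # Item normalisation: lower-case, strip punctuation, stem each token
    words = [w.lower().strip(".,!?") for w in text.split()]
    return [stem(w) for w in words if w]

def vectorise(tokens, vocabulary):
    # One dimension per vocabulary term; the value is the raw term count
    counts = Counter(tokens)
    return [counts[t] for t in vocabulary]

def cosine(a, b):
    # Cosine of the angle between two term vectors of equal dimension
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0

corpus = ["Growing tomatoes and potatoes.",
          "Pepper plants need warmth.",
          "Tomato and potato blight."]
docs = [tokenise(d) for d in corpus]
vocabulary = sorted({t for d in docs for t in d})
matrix = [vectorise(d, vocabulary) for d in docs]   # term-by-document matrix

query = vectorise(tokenise("tomato potato"), vocabulary)
scores = [cosine(query, doc_vec) for doc_vec in matrix]
```

The query and every document live in the same N-dimensional term space; `scores` holds one cosine per document, and the pepper document scores zero because it shares no terms with the query.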
But we now have a new, artificial "problem". We know the general answer to the question "which websites best match my search terms?": this information now exists in our mathematical object, at a high level of abstraction, as the cosine angles of all document vectors against the query vector. The smallest angles pick out the columns, and therefore the documents, we are after; all we need do is present this to the user, right? Well... The issue is that a search engine needs to generate a linear index, i.e. convert the vectors corresponding to the minimum cosine angles into a human-readable format, and until someone thinks of a better way, all engines output lists. Like your shopping list, a list has a start, a middle and an end, and therein lies the problem: how to order the list! The hypothesis seems simple: order information that might look chaotic at first by exploiting the fact that closely associated documents tend to be relevant to similar requests. However, the Internet (being a scale-free network) is so vast that it is not practical to present a chosen feature space representing the x documents closest, by common Euclidean distance, to the convergence point in a given cluster, even though that is what should then be presented to the user in a more intelligible (semantic) display. The engines could simply present the returns produced by the matching algorithms after decomposition: although a grouping generated using probabilistic/fuzzy patterns directly from the cluster might belong to more than one class, the strength (degree of membership) value, measured as a probability on the [0,1] interval, is quite adequate. The reason singular value decomposition works for ordering is that co-occurrence, say that the terms tomato and potato appear together very frequently, is reflected in the term-by-document matrix, which shows that only x of the n terms are used very frequently.
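To make the clustering idea concrete, here is a small sketch of picking the x documents nearest a cluster's convergence point by common Euclidean distance. Treating the centroid as the convergence point, and the toy term-count vectors themselves, are assumptions made for illustration, not a description of any engine's actual algorithm.

```python
import math

def euclidean(a, b):
    # Common Euclidean distance between two equal-length vectors
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def closest_to_centroid(vectors, x):
    # The "convergence point" of the cluster: component-wise mean of its vectors
    n = len(vectors)
    dims = len(vectors[0])
    centroid = [sum(v[i] for v in vectors) / n for i in range(dims)]
    # Rank every document by its distance to that point, keep the x nearest
    ranked = sorted(range(n), key=lambda i: euclidean(vectors[i], centroid))
    return ranked[:x]

# Toy term-count vectors; the third document is an outlier in this cluster
cluster = [[2, 0, 1], [2, 1, 1], [0, 5, 0], [2, 0, 2]]
nearest = closest_to_centroid(cluster, 2)
```

The returned indices form exactly the kind of linear, ordered list the engines are forced to output: the documents nearest the cluster's centre come first, and the outlier is pushed to the bottom.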
The idea is that since a term such as pepper is used or mentioned very little, its axis/dimension does not affect the search space much, making it flat and relevant only in the other dimensions. However, the engines' demonic creators can't do this, because they are still essentially using an inverted file structure, yet they still want absolute correctness in their indexes and returned results. That means trouble, because it assumes your index is perfect, incapable of being manipulated, and that you can somehow order the returns in a meaningful way! Since the returned results can't, in general, represent the documents that match semantically, we now need to account for some subjective quantities that cannot be derived directly from the corpus. The engines attempt to deal with this through a cocktail of criteria that rank the returns so that the "better" results are more likely to be near the top of the list. There are many ways of doing this; the current trend is to use inference about the quality of websites where possible, because such quantities are beyond the direct control of the content creators and the webmasters. PageRank provides a more sophisticated form of citation counting, embodied in the concept of link analysis: a relative value of importance for a page, measured against the average number of citations per reference item. PageRank is currently one of the main factors determining who gets to the top of the listings, but this will all become irrelevant when the engines stop using inverted file structures, because they will then be able to use the groupings generated by probabilistic/fuzzy patterns around the convergence point of a given cluster, under the common Euclidean distance. When the changeover from inverted file structures occurs, there will be two direct consequences: 1) The corpus will be capable of carrying vastly more representative and more detailed data than is currently possible.
2) The corpus will no longer be indexed as it is currently; it will embody semantic meaning and value, where some subjective quantities can be derived directly from the corpus without the need for cocktails of totally artificial rules. The effect is that the corpus will be more accurate and incapable of manipulation, so variations of SEO that involve indirect manipulation of the index will become pointless overnight. It is worth noting that the search providers are becoming increasingly pessimistic about website promotion in all its forms. They currently penalise many things that can affect the results, such as duplicated content (which can be perfectly legitimate) and satellite sites, i.e. one webmaster interlinking seemingly separate but highly relevant websites. They may well start penalising webmasters who promote their websites through articles submitted for third-party distribution, as they do people who post their sites' information to bulletin boards! Being banned from the top search engines can effectively destroy your business, if not directly through loss of visibility then indirectly, in that people tend to judge you on whether you are organised enough to be listed! The criteria are continually changing as the amoral SEO boys attempt to pervert the results; these "laws" are not always clear and there are no appeals. We are all subject to the providers upending a drum and dispensing swift, hard "judgements" that can doom us at any time! The part that irks the most is that as the indexes converge (Google's index is used directly by two of the three top engines, and five others indirectly use it for their rankings), a ban by any one of these engines is effectively enforced by them all.

I am the website administrator of the Wandle Industrial Museum (http://www.wandle.org).
The museum was established in 1983 by local people determined to ensure that the history of the valley was no longer neglected, and to enhance awareness of its heritage for the use and benefit of the community.
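The citation-counting idea behind PageRank, discussed above, can be sketched as a simple power iteration. This is the textbook formulation with an assumed damping factor of 0.85, not Google's production algorithm, and the three-page `web` graph is invented for illustration.

```python
def pagerank(links, damping=0.85, iterations=50):
    # links: page -> list of pages it cites; returns a rank per page summing to ~1
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iterations):
        new = {p: (1.0 - damping) / n for p in pages}
        for p, outgoing in links.items():
            if not outgoing:                      # dangling page: spread evenly
                for q in pages:
                    new[q] += damping * rank[p] / n
            else:
                share = damping * rank[p] / len(outgoing)
                for q in outgoing:
                    new[q] += share
        rank = new
    return rank

# Page "c" is cited by both "a" and "b", so it should outrank "b"
web = {"a": ["b", "c"], "b": ["c"], "c": ["a"]}
ranks = pagerank(web)
```

Each iteration redistributes every page's current rank along its outgoing links, damped towards a uniform baseline; the iteration converges to the importance values the engine would use to order its list.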
© 2006