The Problem with Modern SEO Practices

SEO looks like “snake oil”: pedantic bickering and pontification that, outside the most foundational practices, wastes a great deal of time and energy for virtually no, or at best a highly uncertain, return.

The Fundamental Intelligence Gap

SEO tools overwhelmingly have minimal intelligence, if any. They may be able to compute or identify metrics, but they cannot correctly understand or interpret what those metrics say about a page.

For instance, if my page is definitively longer than the competition’s (this is completely intentional and I know it well), an SEO tool cannot interpret the metric and determine whether or not the length is acceptable. It just blindly says, “it’s longer, so it’s bad.”

It can perform differential analysis on the keywords used by me and my competitors, but it cannot interpret what the differences mean. It cannot tell whether the gap was intentional, how adding those keywords would affect content quality and user experience, and so on.

Tools like SEMRush seem to operate on only one central principle: maximize the volume and diversity of keywords and maximize backlinks. They cannot understand that the total volume and diversity of keywords and backlinks is not directly and strongly correlated with content quality. If anything, they might very well be inversely correlated. These tools really are dumb.

These tools persistently push for convergence to the competition across a large number of parameters/dimensions. If followed blindly, this will make everything generic (in the sense of being very similar to the competition). This is antithetical to a long-term, strong, sustainable SEO/growth strategy, which would virtually unquestionably demand uniqueness in at least some dimension or parameter (or at least severely discourage chasing the competition). Chasing the competition becomes a dog chasing its tail: a never-ending, completely futile endeavour that just burns resources.

Case Studies: SEO Tools in Action

SEMRush’s “On site page checker”

Here is a list of ideas generated by SEMRush’s “On site page checker” for one of my pages:

  • “Make your text content more readable. Compared to your rivals, your text-based content is difficult to read and understand. Try to improve your content’s readability.”
  • “Use video content. Your rivals that rank higher than you in the Google top 10 for some of your target keywords are using video content. Try to embed a video on your page.”
  • “Enrich your page content. Compared to your rivals, some related words are not present in your page’s content. Try to enrich your page’s content with the following semantically related words: (list of suggested related keywords)”
  • “Earn links from more sources. Try to acquire backlinks from the following domains: (list of some domains)”
  • “Add internal links that point to this page. No internal links point to this page. Try to add at least one internal link that point to this page.”

Why These Recommendations Fail

The content is academic (notes on a subject). It is intended to be extremely comprehensive and of high quality, as it is for my own preparation. Of course, it is less readable than mass-produced (possibly AI-generated) marketing slop and articles. That is not a weakness; the page is long, but I have structured it very well with markdown sections, lists, etc. But, of course, the tool does not understand any of that and just pushes for convergence with the competition. The same goes for the video and “missing keywords” suggestions: this is not a usual marketing blog/article, and I cannot simply converge it onto a generic one without completely defeating its core purpose.

Getting organic backlinks is extremely hard. Asking specific domains to feature my website and provide backlinks is not scalable, certainly not over the long term. As search engines evolve and become more intelligent, they are learning, or will learn, to differentiate between a backlink buried in generic, mass-produced slop that has no substance or value to anyone and a backlink from a real user or from a high-quality domain or post.

Regarding internal links, I have added internal links to that page, but somehow the tool does not see them. This minor point aside, the broader lesson is not to take such automated suggestions too seriously without critical evaluation.

SEMRush “Site Audit” Warnings

Similarly, here is what SEMRush’s “Site Audit” produced for a set of pages (these are “warnings”):

  • “3 pages have duplicate H1 and title tags”
  • “2 pages have low text-HTML ratio”
  • “2 pages have a low word count”
  • “1 page doesn’t have enough text within the title tags”
  • “1 page doesn’t have an h1 heading”

These are stupid metrics. Not every page needs a predetermined, static text-to-HTML ratio, word count, and so on. The text-to-HTML ratio sounds like a nonsense metric anyway: the amount of HTML matters only for loading speed, not for search engine rankings or user experience. And there is nothing wrong with an identical H1 and title.
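
For illustration, here is a minimal, hypothetical page (the title and content are invented) whose title tag and H1 are deliberately identical; this is a common and perfectly legitimate pattern for content pages:

```html
<!-- Hypothetical page (title and content invented for illustration).
     The <title> and the <h1> are deliberately identical: the page topic
     is simply stated consistently, which is not a defect. -->
<!DOCTYPE html>
<html lang="en">
<head>
  <meta charset="utf-8">
  <title>Linear Algebra Notes: Eigenvalues and Eigenvectors</title>
</head>
<body>
  <h1>Linear Algebra Notes: Eigenvalues and Eigenvectors</h1>
  <p>Long-form academic notes follow…</p>
</body>
</html>
```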

This is pedantic bickering and pontification.

Seositecheckup.com Analysis

Similarly, look at what “seositecheckup.com” says. I have presented its output in the form “priority: issue”.

  • HIGH: Connect your webpage with social media networks using APIs or AddThis, as social signals are becoming increasingly important for search engines to validate a site’s trustworthiness and authority.
  • MEDIUM: Add a Google Analytics script to this website to help in diagnosing potential SEO issues by monitoring site visitors and traffic sources.
  • HIGH: To improve the website experience for your visitors, it is recommended to eliminate any render-blocking resources on this webpage.
  • HIGH: Consider reducing the HTML size to improve loading times and retain visitors.
  • MEDIUM: This webpage contains too many H1 tags! H1 tags should re-inforce the intended topic of your page to search engines - too many tags may make the topic less clear, or look like spam tactics. Consider using less than 5 H1 tags.

Regarding these specific issues raised by seositecheckup: the absence of Google Analytics or social media integration is not necessarily a critical flaw, and it certainly does not warrant a high or medium priority. Its absence can be perfectly acceptable and, in fact, highly desirable in many cases, but the tool is too dumb to realize that.

I do not see any render-blocking resources. All link tags in the head tag come right at the top before any script tags, with the CSS link tag appearing first. The page is static HTML with minimal embedded JS. It is served from a CloudFront CDN, so it loads insanely fast. It is virtually impossible to have serious performance issues with this setup.
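
Here is a minimal sketch of the head ordering described above; the file names, and the assumption that the only script is small and deferred, are placeholders for illustration:

```html
<!-- Sketch of the head ordering described above; file names are placeholders.
     The CSS <link> comes first, before any <script> tags, and the only script
     is small and deferred, so nothing blocks the initial render. -->
<head>
  <meta charset="utf-8">
  <meta name="viewport" content="width=device-width, initial-scale=1">
  <link rel="stylesheet" href="/css/main.css">
  <title>Academic Notes: Example Page</title>
  <script defer src="/js/minimal.js"></script>
</head>
```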

Here is the performance as reported by that same site (my measured value, followed by its recommended threshold):

  • Time To First Byte Test: 0.207 seconds (recommended: under 0.8 seconds)
  • First Contentful Paint Test: 0.761 seconds (recommended: under 1.8 seconds)
  • Largest Contentful Paint Test: 0.76 seconds (recommended: under 2.5 seconds)

As you can see, the tool is likely categorically incorrect in its claim about render-blocking resources, and even if it is not, this is a complete non-issue until I introduce heavy JS or make the page 10x longer/heavier. As for HTML size, as I said before, the content is academic notes. Naturally, it is very long. I cannot simply shorten it without diminishing its value.

The Nature of Tool-Generated Scores and Recommendations

At first glance, it might seem that the problems highlighted by these tools arise simply because my content is quite different, and perhaps SEO tools cannot differentiate between a generic article and academic notes. But that is a superficial understanding. The core problem, evident across these examples, is the persistent push for convergence with competition. This is a terrible, counter-productive, and possibly dangerous strategy over the long term.

Furthermore, ratings or scores by these tools (including premium ones like SEMRush and free ones like seositecheckup.com) are often based on the potential for improvement as defined by their limited rule sets, not an absolute assessment of the page or site’s inherent quality or effectiveness. Any deviation becomes an “issue.”

The Self-Perpetuating SEO Industry

This model, where scores reflect “fixable” deviations rather than absolute quality, is perfect for an endless cat-and-mouse game and continuous pontification. Cynically, one might observe that any issue identified by such a tool provides an easy justification for keeping or renewing the subscription or contract, thereby fueling the self-perpetuating nature of the SEO tool industry. The tools create the “problems” they then offer to solve.

The Universal Context Deficit: Why SEO Struggles with All Content

The challenges previously highlighted with academic content are not isolated incidents affecting only niche or specialized material; they are symptomatic of a broader, systemic issue. The “intelligence gap” of SEO tools and many prevalent SEO methodologies extends to virtually all content because these systems fundamentally lack the capacity for genuine contextual understanding. Their “dumbness” is universal.

Whether the content is profoundly unique or perfectly generic, these tools operate without grasping its intrinsic purpose, the specific user intent it aims to satisfy, or the nuanced value it might offer. They are programmed with rules derived from observing superficial patterns in vast datasets, often correlating features of high-ranking content without comprehending the causal reasons for that success.

For generic, mass-market content, this inherent lack of deep understanding can sometimes result in recommendations that, coincidentally, align with effective practices for that content type. If a tool suggests adding common keywords or structuring a page like other successful generic pages, it might appear helpful. However, this is not because the tool understands why these elements work for that audience or topic; it’s merely because the content itself fits simple, pre-digested patterns that the tool can recognize. The tool isn’t intelligent; it’s just that the content is, in a sense, “dumbed down” to its level.

The moment content deviates from these broad, simplistic patterns—whether through specialized depth, creative expression, technical specificity, or serving a niche with unconventional needs—this universal lack of contextual intelligence becomes an active liability. If content is intentionally concise for a time-poor professional audience, it might be flagged for “low word count.” If it uses essential specialized jargon for its expert target readers, it risks being penalized for “poor readability” against a baseline of general-audience prose. If it pioneers a new perspective, it may be deemed “lacking in relevant keywords” simply because those keywords aren’t yet common.

Essentially, the entire SEO apparatus, particularly its automated tooling, is built to process surface-level characteristics. It understands very little about the actual quality, strategic intent, or contextual appropriateness of any content. The critique that “it’s dumb for all content; it understands nothing [of substance]” holds true. While this may be masked or even seem beneficial for content that already conforms to the lowest common denominator, for anything aspiring to be more, this fundamental deficit ensures that blindly following generic SEO advice can actively degrade value by pushing all content towards an unhelpful, undifferentiated median.

The Convergence Trap: How SEO Tools Drive Homogenization

The fundamental flaw with most SEO tools lies in their utter lack of genuine intelligence; they possess zero capacity to interpret what any given metric truly means in a specific context. Consequently, they operate on a crude principle: flag every single deviation from the aggregated “competition” data as problematic.

If these recommendations are applied broadly and over the long term, the outcome is not just unhelpful; it is actively destructive. This process will systematically eliminate whatever differentiation or uniqueness you have consciously and strategically built and that your site currently possesses. Your content, structure, and even strategic focus will inexorably converge onto those of your competitors. Everything homogenizes. Your site becomes truly generic.

The insidious nature of this trap is that it can seem to offer initial benefits. Chasing tool-prescribed metrics might yield fleeting upticks in some isolated keyword rankings. However, this is a pyrrhic victory. Over time, as your site sheds its distinctiveness and melts into the undifferentiated morass of competitors, both search performance and user satisfaction will inevitably degrade.

Put bluntly, SEO tools, in their misguided attempt to help you improve rankings, can paradoxically make your rankings worse than when you started. This isn’t merely “problematic”; it is a direct path to digital oblivion, digital suicide, a virtually causal route to fatal underperformance.

This blind adherence to tool-driven convergence represents a complete abdication of strategic thinking. Any serious, sustainable long-term SEO or growth strategy for a website must inherently involve some form of differentiation and the exploitation of an existing market gap or underserved need. Differentiation and gap exploitation are not just SEO tactics; they form the very foundation and core of modern business strategy itself. By blindly following the prescriptions of these unintelligent tools, you abandon this crucial strategic layer.

You cease to think about what makes your offering unique or how to best serve a specific audience. Instead, you begin to merely mimic the competition, converging onto their existing patterns. In doing so, you lose sight of what makes a website truly good, exceptional, or valuable for its target audience. You become alienated from your users themselves, as your focus shifts from their needs and interests to an abstract chase of keywords, clicks, and arbitrary metrics.

The final nail in the coffin for this approach is the relentless evolution of search engine intelligence. Google, and search engines more generally, have incredibly strong incentives to assess sites and pages as intelligently as possible, and they have tangibly and visibly continued to improve their capabilities in this regard. As your site, through tool-driven convergence, becomes by definition more mediocre and average, it sets itself on a collision course with these smarter algorithms. The more intelligent search engines become, the more acutely they will recognize your homogenized content as less unique, less valuable, and ultimately, less “good.” This isn’t a speculative risk; it’s an inevitable disaster waiting to happen for any site that sacrifices its uniqueness at the altar of simplistic SEO tool metrics.

The Wider Malaise of Non-Foundational SEO

The issues highlighted with tool-driven convergence and content misinterpretation are but facets of a larger, more pervasive problem within the broader domain of Search Engine Optimization, particularly those aspects that extend beyond core technical necessities. The non-foundational SEO landscape often resembles a chaotic marketplace characterized by pedantic bickering and incessant pontification. It has acquired, not entirely without justification, a reputation akin to “snake oil” peddling.

This realm is extremely opinionated; for any given SEO challenge or strategy, one can find a myriad of conflicting opinions, often presented with unearned authority. Many of these claims, theories, and proposed strategies are difficult, if not impossible, to verify or test with any degree of scientific rigor. They frequently rely on anecdotal evidence, correlation mistaken for causation, or interpretations of search engine behavior that are speculative at best. The result is an environment where much time and energy are expended chasing elusive “best practices” that may be outdated, irrelevant to a specific context, or simply incorrect.

Furthermore, this pursuit demands significant resources. Beyond the considerable time investment required to merely stay abreast of the shifting sands of SEO discourse, there’s the pressure to utilize expensive paid tools. Subscriptions for comprehensive platforms like SEMRush can run into hundreds of dollars per month, a substantial cost for individuals or small entities. Yet, the return on this investment of time, energy, and money is often highly uncertain.

While foundational technical SEO (ensuring site crawlability, proper indexing, structured data, mobile-friendliness, and site speed) provides clear benefits, the vast ocean of “advanced” or “strategic” SEO advice frequently leads to marginal, unpredictable, or even counterproductive outcomes, especially when it encourages the kind of homogenization or content degradation previously discussed. The pursuit of these ephemeral gains can distract from the core task of creating genuinely valuable, unique content and experiences for the target audience.

Conclusion: What Really Matters in SEO

Put bluntly, SEO, besides the hardcore technical/foundational aspects (like including structured JSON-LD/Microdata/RDFa data in the head tag, including robots.txt and meta robots directives, analytical, data-driven profiling of a set of keywords for, say, a handmade coffee mug store, and so on), is largely a waste of time and counter-productive. It can be antithetical and possibly dangerous or fatal to long-term growth, development, or other goals.
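
To make the foundational category concrete, here is a minimal, hypothetical head snippet combining JSON-LD structured data with a meta robots directive; the store, product, and price are invented for illustration:

```html
<!-- Hypothetical head snippet: JSON-LD structured data plus a meta robots
     directive. Store name, product, and price are invented. -->
<head>
  <meta charset="utf-8">
  <title>Handmade Coffee Mugs | Example Mug Store</title>
  <meta name="robots" content="index, follow">
  <script type="application/ld+json">
  {
    "@context": "https://schema.org",
    "@type": "Product",
    "name": "Handmade Ceramic Coffee Mug",
    "description": "A hand-thrown ceramic mug, glazed and fired in small batches.",
    "brand": { "@type": "Brand", "name": "Example Mug Store" },
    "offers": {
      "@type": "Offer",
      "priceCurrency": "USD",
      "price": "24.00",
      "availability": "https://schema.org/InStock"
    }
  }
  </script>
</head>
```

A plain-text robots.txt at the site root complements the meta directive by controlling crawling site-wide; both are set-once, verifiable pieces of plumbing rather than an ongoing chase of metrics.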

SEO as a field is highly opinionated—100 people often have 100 different opinions—and these opinions are hard to prove or disprove definitively. While having a strategy to achieve rankings on search pages is perfectly reasonable and possibly highly desirable, chasing arbitrary metrics pushed by SEO tools, or getting lost in the speculative noise of non-foundational SEO, represents a futile endeavor that yields minimal impact while potentially compromising content quality and uniqueness.