{
  "title": "Articles/data-quality-over-quantity",
  "caption": "Data Quality Over Quantity",
  "slug": "data-quality-over-quantity",
  "tags": [
    "article",
    "choir-substack",
    "hermes-published",
    "imported-substack",
    "published"
  ],
  "canonical_url": "https://mosiah.org/articles/data-quality-over-quantity/",
  "interactive_url": "https://mosiah.org/#Articles%2Fdata-quality-over-quantity",
  "markdown_url": "https://mosiah.org/articles/data-quality-over-quantity.md",
  "json_url": "https://mosiah.org/json/data-quality-over-quantity.json",
  "fields": {
    "caption": "Data Quality Over Quantity",
    "created": "20260510152127400",
    "modified": "20260510152127400",
    "original-date": "2024-07-15T19:51:58.016Z",
    "original-url": "https://choir.substack.com/p/data-quality-over-quantity",
    "tags": "article hermes-published published imported-substack choir-substack",
    "title": "Articles/data-quality-over-quantity",
    "type": "text/vnd.tiddlywiki"
  },
  "text": "# Data Quality Over Quantity\n\n//The Key to Superior AI Models//\n\n//Related:// [[sources|Article Sources/data-quality-over-quantity]] · [[notes|Article Notes/data-quality-over-quantity]] · [[metadata|Article Metadata/data-quality-over-quantity]] · [[Published Pieces]]\n\nIn the realm of artificial intelligence and machine learning, a common misconception is that more data always leads to better models. While data quantity is important, the quality of that data is often the determining factor in creating truly exceptional AI systems. This piece explores the critical importance of data quality and its impact on AI model performance.\n\nQuality / quantity, if it can be measured, is an efficiency measure representing the value of the average token in the training dataset.\n\nOne proxy for content quality is the number of citations and references to it. Think of the Google PageRank algorithm, itself modeled after academic citation counting.\n\n## The Fundamental Principle: Data In, Data Out\n\nAt its core, machine learning is about learning patterns and distributions from data. The age-old programming adage \"garbage in, garbage out\" applies just as strongly to AI model training. If we feed low-quality, noisy, or irrelevant data into our models, we can expect similarly low-quality outputs. Conversely, high-quality, relevant, and insightful data can lead to models that produce brilliant and valuable results.\n\n## Not All Data Is Created Equal\n\nWhen considering the value of different data sources, it's crucial to understand that some types of content provide more value per token than others. Here's a speculative hierarchy of data quality:\n\n1.  High-Quality Sources:\n\n    - Books\n\n    - Academic papers\n\n    - Well-crafted essays\n\n    - Carefully prepared speeches\n\n    - Transcripts of produced audio/visual content\n\n    - Artifacts of significant cultural relevance, e.g., quotes\n\n    These sources typically contain well-thought-out ideas, structured arguments, and rich, contextual information. They often represent the distilled knowledge and insights of experts in their fields.\n\n2.  Medium-Quality Sources:\n\n    - News articles\n\n    - Blog posts\n\n    - Podcasts\n\n    - Forum discussions on specialized topics\n\n    While these can be valuable, their quality varies widely. They may contain useful information but often lack the depth and rigor of more formal sources. They are also less likely than high-quality sources to be cited by other texts.\n\n3.  Low-Quality Sources:\n\n    - Emails (often perfunctory and referencing external content)\n\n    - Tweets and social media posts\n\n    - Chat logs\n\n    - Comments sections\n\n    These sources typically have low signal-to-noise ratios. They often contain incomplete thoughts, lack context, or focus on ephemeral topics, and are apt to be factually false and socially irrelevant, with effectively zero citations.\n\n## The Impact on AI Models\n\nTraining models on higher-quality data can lead to several benefits:\n\n1.  Better Understanding: Models trained on well-articulated, contextually rich data are more likely to grasp nuanced concepts and complex relationships.\n\n2.  Improved Generalization: High-quality data often covers topics more comprehensively, allowing models to generalize better to new, unseen scenarios.\n\n3.  Reduced Bias and Noise: Curated, high-quality datasets are less likely to contain the kinds of biases and noise prevalent in more casual forms of communication.\n\n4.  More Valuable Outputs: Models trained on insightful, well-structured data are more likely to produce similarly valuable outputs when prompted.\n\n## The Quality-Quantity Balance\n\nWhile this piece emphasizes quality, it's important to note that some quantity is still necessary. The ideal scenario is a large volume of high-quality data. However, when faced with a trade-off, it's often better to have a smaller dataset of excellent quality than a massive dataset of low-quality information. It's hard to definitively prove this without access to the datasets — top secret information — used to train frontier AI models. But as some anecdotal evidence, consider xAI's Grok, the model built on data from X (formerly Twitter); it has notably poor performance given its high parameter count.\n\n## “Conclusion”\n\nAs we continue to advance the field of artificial intelligence, the focus should shift from merely accumulating vast amounts of data to curating high-quality, valuable datasets. Indeed, as we approach the limits of accessible human-originated data, we need to emphasize data quality to improve model output quality.\n\nBy prioritizing data quality over sheer quantity, we can create AI models that not only process information more effectively but also generate more insightful, accurate, and valuable outputs. Remember: in the world of AI, it's not just about how much your model knows, but about the quality of what it knows.\n\n---\n\n//Originally published on Choir Substack: [[https://choir.substack.com/p/data-quality-over-quantity|https://choir.substack.com/p/data-quality-over-quantity]].//\n"
}