{"id":1145,"date":"2025-09-16T06:41:51","date_gmt":"2025-09-16T06:41:51","guid":{"rendered":"https:\/\/ambedkarsociety.org\/assa\/?p=1145"},"modified":"2025-10-06T06:51:07","modified_gmt":"2025-10-06T06:51:07","slug":"india-ai-is-learning-caste-bias-in-india-who-will-audit-it-for-discriminations","status":"publish","type":"post","link":"https:\/\/ambedkarsociety.org\/assa\/india-ai-is-learning-caste-bias-in-india-who-will-audit-it-for-discriminations\/","title":{"rendered":"INDIA: AI is learning caste bias in India. Who will audit it for discriminations?"},"content":{"rendered":"<div class=\"wp-block-image\">\n<figure class=\"alignleft size-full is-resized\"><img loading=\"lazy\" decoding=\"async\" width=\"800\" height=\"450\" src=\"https:\/\/ambedkarsociety.org\/assa\/wp-content\/uploads\/2025\/10\/1758011250497.jpeg\" alt=\"\" class=\"wp-image-1150\" style=\"width:503px;height:auto\" srcset=\"https:\/\/ambedkarsociety.org\/assa\/wp-content\/uploads\/2025\/10\/1758011250497.jpeg 800w, https:\/\/ambedkarsociety.org\/assa\/wp-content\/uploads\/2025\/10\/1758011250497-300x169.jpeg 300w, https:\/\/ambedkarsociety.org\/assa\/wp-content\/uploads\/2025\/10\/1758011250497-768x432.jpeg 768w\" sizes=\"auto, (max-width: 800px) 100vw, 800px\" \/><\/figure><\/div>\n\n\n<p>A young recruiter in Bengaluru told me that the algorithm doesn\u2019t discriminate. \u201cIt just looks at data.\u201d I wanted to believe him.&nbsp;<\/p>\n\n\n\n<p>As a content and brand marketer who\u2019s embraced tech, I have seen the upside and the ugly. The upside is speed, scope, and new forms of storytelling. But the same tools can quietly reproduce the old biases that my people from marginalised castes already fight every day. In short, the machine promises neutrality, but the data it eats often isn\u2019t neutral. That\u2019s not abstract\u2014there\u2019s mounting evidence from India and global research that algorithms trained on skewed data can silence, misclassify, misrepresent, and financially exclude Dalits at scale. The more I see how these systems operate, the more I worry that AI may not free us from prejudice. It may fossilise it.&nbsp;<\/p>\n\n\n\n<p>We often assume technology is neutral, a clean break from the messiness of human bias. Yet, algorithms are only as fair as the data they are trained on. And in India, that data is soaked in centuries of caste hierarchy. CVs from upper-caste names dominate elite institutions. Dalit and Muslim-majority districts are tagged as \u201chigh risk\u201d because poverty is entrenched there. Content moderation systems, primed by majoritarian sentiment, routinely flag Ambedkarite assertion as \u201chate speech\u201d while casteist slurs (<em>bhimta, chamar, chandal, ricebag,&nbsp;<\/em>etc) slip through unchecked. AI is faithfully learning the old prejudices.&nbsp;<\/p>\n\n\n\n<p><strong>Discrimination hiding behind objectivity<\/strong><\/p>\n\n\n\n<p>Generative AI image\/video models trained on massive web datasets frequently reproduce racial and socioeconomic&nbsp;stereotypes. Prompts for \u201chigh-paying jobs\u201d produce lighter-skinned, \u2018upper-caste\u2019 looking images; prompts for low-status work produce darker-skinned figures or visual cues of poverty. Recent academic work specifically quantifies India-centric biases in image output and shows models equate \u201cIndianness\u201d with higher-caste visual markers while depicting Dalit identities with poverty markers. 
This matters for brand creatives, art directors, and producers who increasingly use AI tools for mood boards, casting mockups, and campaign images.

In practice, it means that a Dalit actor or protagonist often ends up depicted in stereotyped ways in the AI mockups used to pitch a series, and the brand creatives who rely on them amplify harmful frames without intending to.

Recommendation engines decide who gets seen on YouTube, Instagram Reels, Spotify playlists, and even publisher homepages. If the training signals (likes, comments, reshares) favour mainstream and majoritarian content, Dalit storytelling (nuanced caste critique, oral histories, counter-narratives) struggles to break into the engagement loop that amplifies content. Investigations of platform dynamics in India show that majoritarian patterns persist online, and experts warn that algorithmic curation can entrench them.

A Dalit writer's long-form explainer or a producer's short documentary gets poor recommendations, not because of quality, but because the model's engagement priors undervalue its audience or mislabel its tone.

The real danger is not just discrimination, but that it hides behind a facade of objectivity. A recruiter can shrug: the system rejected the candidate. A bank can say: the algorithm flagged the loan. Responsibility gets outsourced to the machine, while prejudice is quietly legitimised by the authority of mathematics.

This "tech washing" is especially dangerous in India, where digital systems are becoming the backbone of everyday life: Aadhaar for identity, UPI for payments, predictive analytics for welfare and even policing. A single biased recruiter may block a few careers; a biased algorithm can exclude millions at once, invisibly.

This is not a distant risk. A neo-Buddhist entrepreneur in Uttar Pradesh may be denied credit because his pin code is flagged as high-risk. On social media, Dalit voices celebrating B.R. Ambedkar can be silenced by moderation systems that normalise dominance but punish assertion. Each decision feels technical, but together they paint a bleak picture: a future shaped by machines that think exactly like the casteist society we live in.

**What this looks like in my work sphere**

- A social media manager who is Dalit finds her Ambedkar-themed campaign repeatedly mass-reported and auto-flagged; reviews take days and the campaign loses momentum, while posts with casteist insults remain visible.
- A junior copywriter with a non-Anglicised name gets fewer interview calls when an employer uses an ATS shortlist; internal audits later show the ATS ranked candidates using signals correlated with past hires.
- A brand marketer runs a CSR film about Dalit dignity; ad-delivery optimisers serve the film mostly to urban, upper-caste segments (higher click-throughs), so the film fails to reach the marginalised communities it was meant to engage.
- A producer from a Dalit background is denied fintech credit because the model flags the project's shoot location as "high-risk" on historical economic indicators, slowing production.

There is a fix, and it requires urgency. AI systems must be audited for bias, just as financial accounts are. Datasets must be made more representative, not built solely on histories of exclusion. Companies and governments must be transparent about how algorithms decide who gets jobs, loans, or visibility. Citizens must have the right to question those decisions. Most importantly, Dalit individuals need to be in the rooms where tech is built. Who creates the machine determines whose realities it reflects.
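What the most basic such audit could look like is sketched below: a toy disparate-impact check in the spirit of the "four-fifths rule" used in US employment auditing, applied here to hypothetical shortlisting outcomes. The group labels, counts, and the 0.8 threshold are illustrative assumptions, not a prescribed standard for India.

```python
# A minimal disparate-impact check on shortlisting outcomes.
# Groups, counts, and the 0.8 threshold are illustrative assumptions.

outcomes = {
    # group: (applicants, shortlisted)
    "group_a": (200, 60),  # e.g. candidates resembling past hires
    "group_b": (200, 30),  # e.g. candidates from marginalised backgrounds
}

# Selection rate per group, and the highest rate as the reference point.
rates = {g: s / n for g, (n, s) in outcomes.items()}
reference = max(rates.values())

for group, rate in rates.items():
    ratio = rate / reference
    flag = "FLAG: possible adverse impact" if ratio < 0.8 else "ok"
    print(f"{group}: selection rate {rate:.0%}, impact ratio {ratio:.2f} -> {flag}")
```

The point is not this particular statistic; it is that checks of this kind are cheap to run and easy to mandate, the way financial audits already are.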
For Dalits, the struggle has always been about fighting invisibility: in classrooms, offices, boardrooms, and politics. Now the battle extends to the hidden world of algorithms. India stands at a fork: one path leads to a digital future that automates centuries of discrimination; the other to a more inclusive one, where technology finally does what it promises and levels the playing field. The choice is ours. But every time an algorithm silently decides who is employable, trustworthy, or heard, tomorrow's India is already being coded.

I believe in tech; it is the best tool we have to amplify unheard voices. But belief alone isn't enough. If we let convenience and opaque optimisation dictate how creative talent is discovered, represented, and funded, we risk building a digital culture that mirrors caste hierarchies. To avoid that, we must demand transparency, diversify the rooms where models are built, and insist that every tool shaping careers, funding, and visibility is audited for caste impact. Otherwise, we face a future where creativity is curated by biased models.

Source: The Print (https://theprint.in/opinion/ai-is-learning-caste-bias-in-india/2744333)