{"id":359,"date":"2021-03-14T23:00:00","date_gmt":"2021-03-14T23:00:00","guid":{"rendered":"https:\/\/themarkaz.org\/oldsite\/2021\/03\/the-new-gatekeepers-andy-lee-roth\/"},"modified":"2025-08-21T08:58:56","modified_gmt":"2025-08-21T06:58:56","slug":"the-new-gatekeepers-andy-lee-roth","status":"publish","type":"post","link":"https:\/\/themarkaz.org\/oldsite\/the-new-gatekeepers-andy-lee-roth\/","title":{"rendered":"The New Gatekeepers: How proprietary algorithms increasingly determine the news we see"},"content":{"rendered":"<p>&nbsp;<\/p>\n<h4>Andy Lee Roth<\/h4>\n<p>&nbsp;<\/p>\n<p>Algorithms\u2014artificial intelligence programs controlled by Big Tech companies such as Google, Facebook, and Twitter, corporations with no commitment to ethical journalism\u2014are the new gatekeepers. More and more, proprietary algorithms rather than newsroom editors determine which news stories circulate widely, raising serious concerns about transparency and accountability in determinations of newsworthiness.<\/p>\n<p>The rise of what is best understood as algorithmic censorship makes newly relevant the old concept of &#8220;gatekeeping&#8221; in ways that directly address previous critiques of how we get our news. To illustrate the power of algorithms to control the flow of information, consider what happened to the digital record of an academic conference that I attended last year.<\/p>\n<h4>YouTube and the Critical Media Literacy Conference of the Americas<\/h4>\n<p>In October 2020 I participated in an academic conference focused on media literacy education. The <a href=\"https:\/\/www.projectcensored.org\/critical-media-literacy-conference-of-the-americas\/\">event<\/a> brought together the field&#8217;s leading figures for two days of scholarly panels and discussions. 
Many of the participants, including those in a session I moderated, raised questions about the impact of Big Tech companies such as Google and Facebook on the future of journalism and criticized how <a href=\"https:\/\/www.projectcensored.org\/how-mainstream-media-evolved-into-corporate-media-a-project-censored-history\/\"><em>corporate<\/em> news media<\/a>\u2014including not only Fox News and MSNBC but also the New York Times and Washington Post\u2014often impose narrow definitions of newsworthiness. In other words, the conference was like many others I&#8217;ve attended, except that due to the pandemic we met virtually via Zoom.<\/p>\n<p>After the conference concluded, its organizers uploaded video recordings of the keynote session and more than twenty additional hours of conference presentations to a <a href=\"https:\/\/www.youtube.com\/channel\/UCWo6nbsYT3a5jfHIRF3WmtQ\">YouTube channel<\/a> created to make those sessions available to a wider public.<\/p>\n<figure style=\"width: 486px\" class=\"wp-caption alignleft\"><a class=\" sqs-block-image-link \" href=\"https:\/\/www.project-censored.org\/shop\/p\/project-censored-2021-state-of-the-free-press\" target=\"_blank\" rel=\"noopener\"><img decoding=\"async\" src=\"https:\/\/themarkaz.org\/oldsite\/wp-content\/uploads\/2021\/08\/7S-Censored-2021-cover-750-pix.jpg\" alt=\"Project Censored's State of the Free Press | 2021 surveys \" width=\"486\" height=\"750\" \/><\/a><figcaption class=\"wp-caption-text\">Project Censored&#8217;s State of the Free Press | 2021 surveys &#8220;the desolate landscape of corporate news reporting, where powerful forces interlock to restrict the free flow of information&#8230;&#8221;<\/figcaption><\/figure>\n<p>Several weeks later, YouTube removed all of the conference videos, without any notification or explanation to the conference organizers. 
As <a href=\"https:\/\/www.mintpressnews.com\/media-censorship-conference-censored-youtube\/274918\/\">MintPress News reported<\/a>, an academic conference at which many participants raised warnings about &#8220;the dangers of media censorship&#8221; had, ironically, &#8220;been censored by YouTube.&#8221; Despite the organizers&#8217; subsequent formal appeals, YouTube refused to restore any of the deleted content; indeed, it declined even to acknowledge that the content had ever been posted in the first place.<\/p>\n<p>Through my work with <a href=\"https:\/\/www.projectcensored.org\/\" target=\"_blank\" rel=\"noopener\">Project Censored<\/a>, a nonprofit news watchdog with a global reputation for opposing news censorship and championing press freedoms, I was already familiar with online content filtering. YouTube&#8217;s power to delete the public video record of an academic conference, without explanation, initially reminded me of the &#8220;memory holes&#8221; in George Orwell&#8217;s <em>Nineteen Eighty-Four<\/em>. In Orwell&#8217;s dystopian novel, memory holes efficiently whisk away for destruction any evidence that might conflict with or undermine the government&#8217;s interests, as determined by the Ministry of Truth.<\/p>\n<p>But I also found myself recalling a theory of news production and distribution that enjoyed popularity in the 1950s but has since fallen from favor. I&#8217;ve come to understand YouTube&#8217;s removal of the conference videos as a new form of <em>gatekeeping<\/em>\u2014the concept developed by David Manning White and Walter Gieber in the 1950s to explain how newspaper editors determined which stories to publish as news.<\/p>\n<h4>The original gatekeeping model<\/h4>\n<p>White studied the decisions of a wire editor at a small midwestern newspaper, examining the reasons that the editor, whom White called &#8220;Mr. Gates,&#8221; gave for selecting or rejecting specific stories for publication. Mr. 
Gates rejected some stories for practical reasons\u2014&#8220;too vague,&#8221; &#8220;dull writing,&#8221; or &#8220;too late\u2014no space.&#8221; But in 18 of the 423 decisions that White examined, Mr. Gates rejected stories for political reasons, dismissing them as &#8220;pure propaganda&#8221; or &#8220;too red,&#8221; for example. White concluded his <a href=\"http:\/\/www.aejmc.org\/home\/wp-content\/uploads\/2012\/09\/Journalism-Quarterly-1950-White-383-90.pdf\">1950 article<\/a> by emphasizing &#8220;how highly subjective, how based on the gatekeeper&#8217;s own set of experiences, attitudes and expectations the communication of \u2018news&#8217; really is.&#8221;<\/p>\n<p>In 1956, Walter Gieber conducted a similar study, this time examining the decisions of 16 different wire editors. Gieber&#8217;s <a href=\"https:\/\/journals.sagepub.com\/doi\/10.1177\/107769905603300401\">findings<\/a> refuted White&#8217;s conclusion that gatekeeping was subjective. Instead, Gieber found that, independently of one another, editors made much the same decisions. Gatekeeping was real, but the editors treated story selection as a rote task, and they were most concerned with what Gieber described as &#8220;goals of production&#8221; and &#8220;bureaucratic routine&#8221;\u2014not, in other words, with advancing any particular political agenda. More recent studies have <a href=\"https:\/\/journals.sagepub.com\/doi\/10.1177\/107769900107800202\">reinforced and refined<\/a> Gieber&#8217;s conclusion that professional assessments of &#8220;newsworthiness,&#8221; not political partisanship, guide news workers&#8217; decisions about what stories to cover.<\/p>\n<p>The gatekeeping model fell out of favor as newer theoretical models\u2014including &#8220;framing&#8221; and &#8220;agenda setting&#8221;\u2014seemed to explain more of the news production process. 
In an influential <a href=\"https:\/\/journals.sagepub.com\/doi\/abs\/10.1177\/016344389011003002\">1989 article<\/a>, sociologist Michael Schudson described gatekeeping as &#8220;a handy, if not altogether appropriate, metaphor.&#8221; The gatekeeping model was problematic, he wrote, because &#8220;it leaves \u2018information&#8217; sociologically untouched, a pristine material that comes to the gate already prepared.&#8221; In that flawed view &#8220;news&#8221; is preformed, and the gatekeeper &#8220;simply decides which pieces of prefabricated news will be allowed through the gate.&#8221; Although White and others had noted that &#8220;gatekeeping&#8221; occurs at multiple stages in the news production process, Schudson&#8217;s critique stuck.<\/p>\n<p>With the advent of the Internet, some scholars attempted to revive the gatekeeping model. New studies showed how <a href=\"https:\/\/aisel.aisnet.org\/cgi\/viewcontent.cgi?article=1049&amp;context=thci\"><em>audiences<\/em> increasingly act as gatekeepers<\/a>, deciding which news items to pass along via their own social media accounts. But, overall, gatekeeping seemed even more dated: &#8220;The Internet defies the whole notion of a \u2018gate&#8217; and challenges the idea that journalists (or anyone else) can or should limit what passes through it,&#8221; Jane B. Singer <a href=\"https:\/\/openaccess.city.ac.uk\/id\/eprint\/3462\/6\/2006SteppingBackJMCQSinger.pdf\">wrote<\/a> in 2006.<\/p>\n<h4><strong>Algorithmic news filtering<\/strong><\/h4>\n<p>Fast forward to the present and Singer&#8217;s optimistic assessment appears more dated than gatekeeping theory itself. 
Instead, the Internet, and social media in particular, encompass numerous limiting &#8220;gates,&#8221; fewer and fewer of which are operated by news organizations or journalists themselves.<\/p>\n<p>Incidents such as YouTube&#8217;s wholesale removal of the media literacy conference videos are not isolated; in fact, they are increasingly common as privately owned companies and their media platforms wield ever more power to regulate who speaks online and the types of speech that are permissible.<\/p>\n<p>Independent news outlets have documented how Twitter, Facebook, and others have suspended Venezuelan, Iranian, and Syrian accounts and <a href=\"https:\/\/thegrayzone.com\/2020\/01\/12\/us-pressure-social-media-censoring-suspending-venezuela-iran-syria\/\">censored content that conflicts with U.S. foreign policy<\/a>; how the Google News aggregator filters out pro-LGBTQ stories while <a href=\"https:\/\/journals.sagepub.com\/doi\/10.1177\/0306422020917088\">amplifying homophobic and transphobic voices<\/a>; and how changes made by Facebook to its news feed have <a href=\"https:\/\/www.motherjones.com\/politics\/2019\/02\/how-facebook-screwed-us-all\/\">throttled web traffic<\/a> to progressive news outlets.<\/p>\n<p>Some Big Tech companies&#8217; decisions have made headline news. After the 2020 presidential election, for example, Google, Facebook, YouTube, Twitter, and Instagram restricted the online communications of Donald Trump and his supporters; after the January 6 assault on the Capitol, Google, Apple, and Amazon suspended Parler, the social media platform favored by many of Trump&#8217;s supporters.<\/p>\n<p>But the decisions to deplatform Donald Trump and suspend Parler differ in two fundamental ways from most other cases of online content regulation by Big Tech companies. First, the instances involving Trump and Parler received widespread news coverage; those decisions became <em>public<\/em> issues and were debated as such. 
Second, as that news coverage tacitly conveyed, the decisions to restrict Trump&#8217;s online voice and Parler&#8217;s networked reach were made by leaders at Google, Facebook, Apple, and Amazon. They were <em>human<\/em> decisions.<\/p>\n<hr \/>\n<figure style=\"width: 848px\" class=\"wp-caption aligncenter\"><img decoding=\"async\" src=\"https:\/\/themarkaz.org\/oldsite\/wp-content\/uploads\/2021\/08\/thoughtpolicebyalibanisadr.jpg\" alt=\"&quot;Thought Police&quot; by Ali Banisadr, oil on linen, 82 x 120 inches (2019). Courtesy of the artist.\" width=\"848\" height=\"580\" \/><figcaption class=\"wp-caption-text\">&#8220;Thought Police&#8221; by Ali Banisadr, oil on linen, 82 x 120 inches (2019). Courtesy of the artist.<\/figcaption><\/figure>\n<hr \/>\n<p>This last point was not a focus of the resulting news coverage, but it matters a great deal for understanding the stakes in other cases, where the decisions to filter content\u2014in effect, to silence voices and throttle conversations\u2014were made by <em>algorithms<\/em>, rather than humans.<\/p>\n<p>Increasingly, the news we encounter is the product of <em>both<\/em> the daily routines and professional judgments of journalists, editors, and other news professionals <em>and<\/em> the assessments of relevance and appropriateness made by artificial intelligence programs that have been developed and are controlled by private for-profit corporations that <a href=\"https:\/\/www.theguardian.com\/technology\/2018\/jul\/02\/facebook-mark-zuckerberg-platform-publisher-lawsuit\">do not see themselves as media companies<\/a>, much less ones engaged in journalism. When I search for news about &#8220;rabbits gone wild&#8221; or the Equality Act on Google News, an algorithm employs a variety of confidential criteria to determine what news stories and news sources appear in response to my query. 
Google News does not produce any news stories of its own but, like Facebook and other platforms that function as news aggregators, it plays an enormous\u2014and <a href=\"https:\/\/www.pewresearch.org\/fact-tank\/2020\/12\/08\/many-americans-are-unsure-whether-sources-of-news-do-their-own-reporting\/\">poorly understood<\/a>\u2014role in determining what news stories many Americans see.<\/p>\n<h4><strong>The new algorithmic gatekeeping<\/strong><\/h4>\n<p>Recall that Schudson criticized the gatekeeping model for &#8220;leaving \u2018information&#8217; sociologically untouched.&#8221; Because news was constructed, not prefabricated, the gatekeeping model failed to address the complexity of the news production process, Schudson contended. That critique, however, no longer applies to the increasingly common circumstances in which corporations such as Google and Facebook, which do not practice journalism themselves, determine what news stories members of the public are most likely to see\u2014and what news topics or news outlets those audiences are unlikely to ever come across, unless they actively seek them out.<\/p>\n<p>In these cases, Google, Facebook, and other social media companies have no hand\u2014or interest\u2014in the <em>production<\/em> of the stories that their algorithms either promote or bury. 
Without regard for the <a href=\"https:\/\/www.spj.org\/ethicscode.asp\">basic principles of ethical journalism<\/a> as recommended by the Society of Professional Journalists\u2014to seek the truth and report it; to minimize harm; to act independently; and to be accountable and transparent\u2014the new gatekeepers claim content neutrality while promoting news stories that often fail glaringly to fulfill even one of the SPJ&#8217;s ethical guidelines.<\/p>\n<p>This problem is compounded by the reality that it is impossible for a contemporary version of David Manning White or Walter Gieber to study gatekeeping processes at Google or Facebook: The algorithms engaged in the new gatekeeping are protected from public scrutiny as proprietary intellectual property. As avram anderson [sic] and I have <a href=\"https:\/\/journals.sagepub.com\/doi\/10.1177\/0306422020917088\">previously reported<\/a>, a class action suit filed against YouTube in August 2019 by LGBT content creators could &#8220;force Google to make its powerful algorithms available for scrutiny.&#8221; Google\/YouTube has <a href=\"https:\/\/www.theverge.com\/2020\/6\/3\/21278050\/youtube-lawsuit-lgbtq-google-doj-section-230-trump-executive-order\">sought to dismiss the case<\/a> on the grounds that its distribution algorithms are &#8220;not content-based.&#8221;<\/p>\n<hr \/>\n<figure style=\"width: 849px\" class=\"wp-caption aligncenter\"><img decoding=\"async\" src=\"https:\/\/themarkaz.org\/oldsite\/wp-content\/uploads\/2021\/08\/trustinthefuturebyalibanisadr.jpg\" alt=\"&quot;Trust in the Future&quot; by Ali Banisadr, oil on linen, 82 x 120 inches (2017). Courtesy of the artist.\" width=\"849\" height=\"580\" \/><figcaption class=\"wp-caption-text\">&#8220;Trust in the Future&#8221; by Ali Banisadr, oil on linen, 82 x 120 inches (2017). 
Courtesy of the artist.<\/figcaption><\/figure>\n<hr \/>\n<h4><strong>Algorithms, human agency, and inequalities<\/strong><\/h4>\n<p>To be accountable and transparent is one of the guiding principles of ethical journalism, as advocated by the Society of Professional Journalists. News gatekeeping conducted by proprietary algorithms runs directly counter to this ethical guideline, posing grave threats to the integrity of journalism and the likelihood of a well-informed public.<\/p>\n<p>Most often when Google, Facebook, and other Big Tech companies are considered in relation to journalism and the conditions necessary for it to fulfill its fundamental role as the &#8220;Fourth Estate&#8221;\u2014holding the powerful accountable and informing the public\u2014the focus is on how Big Tech has thoroughly appropriated the advertising revenues on which most legacy media outlets depend to stay in business. The rise of algorithmic news gatekeeping should be just as great a concern.<\/p>\n<p>Technologies driven by artificial intelligence reduce the role of human agency in decision making. This is often touted by advocates of AI as a selling point: Algorithms replace human subjectivity and fallibility with &#8220;objective&#8221; determinations.<\/p>\n<p>Critical studies of algorithmic bias\u2014including Safiya Umoja Noble&#8217;s <em>Algorithms of Oppression<\/em>, Virginia Eubanks&#8217;s <em>Automating Inequality<\/em>, and Cathy O&#8217;Neil&#8217;s <em>Weapons of Math Destruction<\/em>\u2014advise us to be wary of how easy it is to build longstanding human prejudices into &#8220;viewpoint neutral&#8221; algorithms that, in turn, add new layers to deeply sedimented structural inequalities.<\/p>\n<p>With the new algorithmic gatekeeping of news developing more quickly than public understanding of it, journalists and those concerned with the role of journalism in democracy face multiple threats. 
We must exert all possible pressure to force corporations such as Google and Facebook to make their algorithms available for third-party scrutiny; at the same time, we must do more to educate the public about this new and subtle wrinkle in the news production process.<\/p>\n<p>Journalists are well positioned to tell this story from first-hand experience, and governmental regulation or pending lawsuits may eventually compel Big Tech companies to open their algorithms to outside review. But the stakes are too high to wait on the sidelines for others to solve the problem. So what can we do now in response to algorithmic gatekeeping? I recommend four proactive responses, presented in increasing order of engagement:<\/p>\n<blockquote><ul>\n<li><strong>Avoid using &#8220;Google&#8221; as a verb<\/strong>, a common habit that tacitly identifies a generic online activity with the brand name of a corporation that has done as much as any to multiply <a href=\"https:\/\/www.nytimes.com\/2020\/01\/24\/opinion\/sunday\/surveillance-capitalism.html\">epistemic inequality<\/a>, the concept developed by Shoshana Zuboff, author of <em>The Age of Surveillance Capitalism<\/em>, to describe a form of power based on the difference between what we can know and what can be known about us.<\/li>\n<li><strong>Remember that search engines and social media feeds are not neutral information sources.<\/strong> The algorithms that drive them often serve to reproduce existing inequalities in subtle but powerful ways. Investigate for yourself: select a topic of interest to you and compare search results from Google and <a href=\"https:\/\/www.phillymag.com\/news\/2021\/02\/20\/duckduckgo-data-privacy-paoli\/\">DuckDuckGo<\/a>.<\/li>\n<li><strong>Connect directly to news organizations that display firm commitments to ethical journalism<\/strong>, rather than relying on your social media feed for news. Go to the outlet&#8217;s website, sign up for its email list or RSS feed, and subscribe to the outlet&#8217;s print version if there is one. The direct connection removes the social media platform or search engine as an unnecessary and potentially biased intermediary.<\/li>\n<li><strong>Call out algorithmic bias when you encounter it.<\/strong> Call it out directly to the entity responsible for it; call it out publicly by letting others know about it.<\/li>\n<\/ul><\/blockquote>\n<p>Fortunately, our human brains can employ new information in ways that algorithms cannot. Understanding the influential role algorithms play in our lives\u2014including how they operate as gatekeepers of the news stories we are most likely to see\u2014allows us to take greater control of our individual online experiences. From that greater individual awareness and control, we can begin to organize collectively to expose and oppose algorithmic bias and censorship.<\/p>\n<p>&nbsp;<\/p>\n","protected":false},"excerpt":{"rendered":"<p>Would you trust an algorithm to sell you a used car? 
Andy Lee Roth peers under the hood of Big Tech and finds plenty we should be worrying about.<\/p>\n","protected":false},"author":38,"featured_media":3045,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"_acf_changed":false,"om_disable_all_campaigns":false,"footnotes":""},"categories":[12,48,50],"tags":[337,457,619,740,1079,1396,1743,1851],"coauthors":[1889],"class_list":["post-359","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-essay","category-truth","category-tmr-issues","tag-big-tech","tag-corporate-media","tag-facebook","tag-google","tag-mainstream-media","tag-press-censorship","tag-twitter","tag-youtube","entry"],"acf":[],"yoast_head":"<!-- This site is optimized with the Yoast SEO Premium plugin v25.8 (Yoast SEO v27.4) - https:\/\/yoast.com\/product\/yoast-seo-premium-wordpress\/ -->\n<title>The New Gatekeepers: How proprietary algorithms increasingly determine the news we see - The Markaz Review<\/title>\n<meta name=\"description\" content=\"Would you trust an algorithm to sell you a used car? Andy Lee Roth peers under the hood of Big Tech and finds plenty we should be worrying about.\" \/>\n<meta name=\"robots\" content=\"index, follow, max-snippet:-1, max-image-preview:large, max-video-preview:-1\" \/>\n<link rel=\"canonical\" href=\"https:\/\/themarkaz.org\/oldsite\/the-new-gatekeepers-andy-lee-roth\/\" \/>\n<meta property=\"og:locale\" content=\"en_US\" \/>\n<meta property=\"og:type\" content=\"article\" \/>\n<meta property=\"og:title\" content=\"The New Gatekeepers: How proprietary algorithms increasingly determine the news we see\" \/>\n<meta property=\"og:description\" content=\"Would you trust an algorithm to sell you a used car? 
Andy Lee Roth peers under the hood of Big Tech and finds plenty we should be worrying about.\" \/>\n<meta property=\"og:url\" content=\"https:\/\/themarkaz.org\/oldsite\/the-new-gatekeepers-andy-lee-roth\/\" \/>\n<meta property=\"og:site_name\" content=\"The Markaz Review\" \/>\n<meta property=\"article:published_time\" content=\"2021-03-14T23:00:00+00:00\" \/>\n<meta property=\"article:modified_time\" content=\"2025-08-21T06:58:56+00:00\" \/>\n<meta property=\"og:image\" content=\"https:\/\/themarkaz.org\/oldsite\/wp-content\/uploads\/2021\/08\/thegatekeepersalibanisadr2010-1400pix.jpg\" \/>\n\t<meta property=\"og:image:width\" content=\"1400\" \/>\n\t<meta property=\"og:image:height\" content=\"931\" \/>\n\t<meta property=\"og:image:type\" content=\"image\/jpeg\" \/>\n<meta name=\"author\" content=\"Andy Lee Roth\" \/>\n<meta name=\"twitter:card\" content=\"summary_large_image\" \/>\n<meta name=\"twitter:label1\" content=\"Written by\" \/>\n\t<meta name=\"twitter:data1\" content=\"Andy Lee Roth\" \/>\n\t<meta name=\"twitter:label2\" content=\"Est. 
reading time\" \/>\n\t<meta name=\"twitter:data2\" content=\"11 minutes\" \/>\n<script type=\"application\/ld+json\" class=\"yoast-schema-graph\">{\"@context\":\"https:\\\/\\\/schema.org\",\"@graph\":[{\"@type\":\"Article\",\"@id\":\"https:\\\/\\\/themarkaz.org\\\/oldsite\\\/the-new-gatekeepers-andy-lee-roth\\\/#article\",\"isPartOf\":{\"@id\":\"https:\\\/\\\/themarkaz.org\\\/oldsite\\\/the-new-gatekeepers-andy-lee-roth\\\/\"},\"author\":{\"name\":\"Andy Lee Roth\",\"@id\":\"https:\\\/\\\/themarkaz.org\\\/oldsite\\\/#\\\/schema\\\/person\\\/451a1c3fabd001d3bca969224fe24439\"},\"headline\":\"The New Gatekeepers: How proprietary algorithms increasingly determine the news we see\",\"datePublished\":\"2021-03-14T23:00:00+00:00\",\"dateModified\":\"2025-08-21T06:58:56+00:00\",\"mainEntityOfPage\":{\"@id\":\"https:\\\/\\\/themarkaz.org\\\/oldsite\\\/the-new-gatekeepers-andy-lee-roth\\\/\"},\"wordCount\":2384,\"commentCount\":3,\"publisher\":{\"@id\":\"https:\\\/\\\/themarkaz.org\\\/oldsite\\\/#organization\"},\"image\":{\"@id\":\"https:\\\/\\\/themarkaz.org\\\/oldsite\\\/the-new-gatekeepers-andy-lee-roth\\\/#primaryimage\"},\"thumbnailUrl\":\"https:\\\/\\\/themarkaz.org\\\/oldsite\\\/wp-content\\\/uploads\\\/2021\\\/08\\\/thegatekeepersalibanisadr2010-1400pix.jpg\",\"keywords\":[\"Big Tech\",\"corporate media\",\"Facebook\",\"Google\",\"mainstream media\",\"press censorship\",\"Twitter\",\"YouTube\"],\"articleSection\":[\"Essays\",\"TMR 7 \u2022 TRUTH?\",\"TMR Issues\"],\"inLanguage\":\"en-US\",\"potentialAction\":[{\"@type\":\"CommentAction\",\"name\":\"Comment\",\"target\":[\"https:\\\/\\\/themarkaz.org\\\/oldsite\\\/the-new-gatekeepers-andy-lee-roth\\\/#respond\"]}]},{\"@type\":\"WebPage\",\"@id\":\"https:\\\/\\\/themarkaz.org\\\/oldsite\\\/the-new-gatekeepers-andy-lee-roth\\\/\",\"url\":\"https:\\\/\\\/themarkaz.org\\\/oldsite\\\/the-new-gatekeepers-andy-lee-roth\\\/\",\"name\":\"The New Gatekeepers: How proprietary algorithms increasingly determine the news we see - 
The Markaz Review\",\"isPartOf\":{\"@id\":\"https:\\\/\\\/themarkaz.org\\\/oldsite\\\/#website\"},\"primaryImageOfPage\":{\"@id\":\"https:\\\/\\\/themarkaz.org\\\/oldsite\\\/the-new-gatekeepers-andy-lee-roth\\\/#primaryimage\"},\"image\":{\"@id\":\"https:\\\/\\\/themarkaz.org\\\/oldsite\\\/the-new-gatekeepers-andy-lee-roth\\\/#primaryimage\"},\"thumbnailUrl\":\"https:\\\/\\\/themarkaz.org\\\/oldsite\\\/wp-content\\\/uploads\\\/2021\\\/08\\\/thegatekeepersalibanisadr2010-1400pix.jpg\",\"datePublished\":\"2021-03-14T23:00:00+00:00\",\"dateModified\":\"2025-08-21T06:58:56+00:00\",\"description\":\"Would you trust an algorithm to sell you a used car? Andy Lee Roth peers under the hood of Big Tech and finds plenty we should be worrying about.\",\"breadcrumb\":{\"@id\":\"https:\\\/\\\/themarkaz.org\\\/oldsite\\\/the-new-gatekeepers-andy-lee-roth\\\/#breadcrumb\"},\"inLanguage\":\"en-US\",\"potentialAction\":[{\"@type\":\"ReadAction\",\"target\":[\"https:\\\/\\\/themarkaz.org\\\/oldsite\\\/the-new-gatekeepers-andy-lee-roth\\\/\"]}]},{\"@type\":\"ImageObject\",\"inLanguage\":\"en-US\",\"@id\":\"https:\\\/\\\/themarkaz.org\\\/oldsite\\\/the-new-gatekeepers-andy-lee-roth\\\/#primaryimage\",\"url\":\"https:\\\/\\\/themarkaz.org\\\/oldsite\\\/wp-content\\\/uploads\\\/2021\\\/08\\\/thegatekeepersalibanisadr2010-1400pix.jpg\",\"contentUrl\":\"https:\\\/\\\/themarkaz.org\\\/oldsite\\\/wp-content\\\/uploads\\\/2021\\\/08\\\/thegatekeepersalibanisadr2010-1400pix.jpg\",\"width\":1400,\"height\":931,\"caption\":\"\u201cThe Gatekeepers\u201d by Ali Banisadr, (b. Tehran 1976, lives and works in New York), oil on linen, 72 x 108 inches (2010). 
Courtesy of the artist.\"},{\"@type\":\"BreadcrumbList\",\"@id\":\"https:\\\/\\\/themarkaz.org\\\/oldsite\\\/the-new-gatekeepers-andy-lee-roth\\\/#breadcrumb\",\"itemListElement\":[{\"@type\":\"ListItem\",\"position\":1,\"name\":\"Home\",\"item\":\"https:\\\/\\\/themarkaz.org\\\/oldsite\\\/\"},{\"@type\":\"ListItem\",\"position\":2,\"name\":\"The New Gatekeepers: How proprietary algorithms increasingly determine the news we see\"}]},{\"@type\":\"WebSite\",\"@id\":\"https:\\\/\\\/themarkaz.org\\\/oldsite\\\/#website\",\"url\":\"https:\\\/\\\/themarkaz.org\\\/oldsite\\\/\",\"name\":\"The Markaz Review\",\"description\":\"Literature and Arts from the Center of the World\",\"publisher\":{\"@id\":\"https:\\\/\\\/themarkaz.org\\\/oldsite\\\/#organization\"},\"potentialAction\":[{\"@type\":\"SearchAction\",\"target\":{\"@type\":\"EntryPoint\",\"urlTemplate\":\"https:\\\/\\\/themarkaz.org\\\/oldsite\\\/?s={search_term_string}\"},\"query-input\":{\"@type\":\"PropertyValueSpecification\",\"valueRequired\":true,\"valueName\":\"search_term_string\"}}],\"inLanguage\":\"en-US\"},{\"@type\":\"Organization\",\"@id\":\"https:\\\/\\\/themarkaz.org\\\/oldsite\\\/#organization\",\"name\":\"The Markaz Review\",\"url\":\"https:\\\/\\\/themarkaz.org\\\/oldsite\\\/\",\"logo\":{\"@type\":\"ImageObject\",\"inLanguage\":\"en-US\",\"@id\":\"https:\\\/\\\/themarkaz.org\\\/oldsite\\\/#\\\/schema\\\/logo\\\/image\\\/\",\"url\":\"https:\\\/\\\/themarkaz.org\\\/oldsite\\\/wp-content\\\/uploads\\\/2023\\\/08\\\/cropped-New-2023-TMR-Logo-500-pix.jpg\",\"contentUrl\":\"https:\\\/\\\/themarkaz.org\\\/oldsite\\\/wp-content\\\/uploads\\\/2023\\\/08\\\/cropped-New-2023-TMR-Logo-500-pix.jpg\",\"width\":473,\"height\":191,\"caption\":\"The Markaz Review\"},\"image\":{\"@id\":\"https:\\\/\\\/themarkaz.org\\\/oldsite\\\/#\\\/schema\\\/logo\\\/image\\\/\"}},{\"@type\":\"Person\",\"@id\":\"https:\\\/\\\/themarkaz.org\\\/oldsite\\\/#\\\/schema\\\/person\\\/451a1c3fabd001d3bca969224fe24439\",\"name\":\"Andy Lee 