
<rss version="2.0" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:media="http://search.yahoo.com/mrss/">
    <channel>
        <title><![CDATA[ The Cloudflare Blog ]]></title>
        <description><![CDATA[ Get the latest news on how products at Cloudflare are built, technologies used, and join the teams helping to build a better Internet. ]]></description>
        <link>https://blog.cloudflare.com</link>
        <atom:link href="https://blog.cloudflare.com/" rel="self" type="application/rss+xml"/>
        <language>en-us</language>
        <image>
            <url>https://blog.cloudflare.com/favicon.png</url>
            <title>The Cloudflare Blog</title>
            <link>https://blog.cloudflare.com</link>
        </image>
        <lastBuildDate>Wed, 15 Apr 2026 22:51:09 GMT</lastBuildDate>
        <item>
            <title><![CDATA[A simpler path to a safer Internet: an update to our CSAM scanning tool]]></title>
            <link>https://blog.cloudflare.com/a-simpler-path-to-a-safer-internet-an-update-to-our-csam-scanning-tool/</link>
            <pubDate>Wed, 24 Sep 2025 14:00:00 GMT</pubDate>
            <description><![CDATA[ Cloudflare has made it even easier to enable our free child safety tooling for all customers. ]]></description>
            <content:encoded><![CDATA[ <p>Launching a website or an online community brings people together to create and share. The operators of these platforms, sadly, also have to navigate what happens when bad actors attempt to misuse those destinations to spread the most heinous content, like child sexual abuse material (CSAM).</p><p>We are committed to helping anyone on the Internet protect their platform from this kind of misuse. We <a href="https://blog.cloudflare.com/the-csam-scanning-tool/"><u>first launched</u></a> a CSAM Scanning Tool several years ago to give any website on the Internet the ability to programmatically scan content uploaded to their platform for instances of CSAM, in partnership with the National Center for Missing and Exploited Children (NCMEC), Interpol, and dozens of other organizations committed to protecting children. That release took technology that was previously available only to the largest social media platforms and provided it to any website.</p><p>However, the tool still required setup work that added friction to its adoption. To file reports to NCMEC, customers needed to create their own credentials. Creating and sharing those credentials was too confusing, or simply too much work, for small site owners. We did our best to help them with secondary reports, but we needed a method that made this seamless to encourage adoption.</p><p>Today’s announcement makes that process significantly easier for site owners, helping them contribute to keeping the Internet safer with even less manual effort. The tool no longer requires website operators to create and provide their own unique NCMEC credentials. The result is that we have seen monthly adoption of the tool increase by 1,600% since the introduction of this change in February.</p>
    <div>
      <h3>How does it work?</h3>
      <a href="#how-does-it-work">
        
      </a>
    </div>
    <p>Services that attempt to flag and stop the spread of CSAM rely on partner organizations, like NCMEC, who maintain lists of hashes of known CSAM. These hashes are numerical representations of images: an algorithm distills each photo into a kind of digital fingerprint. Partners who operate these tools, like Cloudflare, check hashes of submitted content against the lists maintained by organizations like NCMEC to see if there is a match. You can read about the operation in detail in our previous announcement <a href="https://blog.cloudflare.com/the-csam-scanning-tool/#finding-similar-images"><u>here</u></a>.</p><p>We rely on fuzzy hashing, a technique that goes beyond simple one-to-one matches. With a traditional hash, if a photo of CSAM is altered even slightly — by adding a filter, cropping it, or adding some noise — the fingerprint changes completely.</p><p>A fuzzy hash, on the other hand, creates a "perceptual fingerprint." Even if an image is modified, its fuzzy hash will remain similar to the original. This allows our tool to identify matches with a high degree of confidence, even if the abuser tries to disguise the content.</p><p>Removing the requirement to share a credential with Cloudflare eliminates one more step in deploying and enabling our tool, but site operators are still expected to file their own reports with NCMEC or their regional equivalent.</p>
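    <p>To build intuition for why a fuzzy hash survives small edits, here is a toy average-hash ("aHash") sketch in Python. This is an illustration only, not the algorithm Cloudflare or its partners actually use; the 8x8 pixel grid, the thresholding scheme, and the distance cutoff are all simplified assumptions.</p>

```python
# Toy illustration of perceptual ("fuzzy") hashing: a simplified average
# hash (aHash), NOT the actual algorithm Cloudflare or NCMEC use.
# The image is reduced to a 64-bit fingerprint; small edits flip only a
# few bits, so near-duplicates stay within a small Hamming distance.

def average_hash(pixels):
    """pixels: an 8x8 grid of grayscale values (0-255). Returns a 64-bit int."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    bits = 0
    for p in flat:
        # Each bit records whether that pixel is brighter than the average.
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

def hamming_distance(a, b):
    """Number of differing bits between two fingerprints."""
    return bin(a ^ b).count("1")

# A toy "image" and a uniformly brightened copy of it.
original = [[(x * y) % 256 for x in range(8)] for y in range(8)]
edited = [[min(p + 10, 255) for p in row] for row in original]

# A small Hamming distance (a few bits out of 64) signals a likely match;
# a cryptographic hash would instead produce a completely different value.
print(hamming_distance(average_hash(original), average_hash(edited)))  # prints 0
```

    <p>Production systems use far more robust perceptual hashes (such as PhotoDNA, discussed elsewhere on this blog), but the matching principle is the same: compare fingerprints by distance rather than by exact equality.</p>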
    <div>
      <h3>What is the process now?</h3>
      <a href="#what-is-the-process-now">
        
      </a>
    </div>
    <p>The process for using the tool is now straightforward:</p><ol><li><p><b>Enable the Tool:</b> Activate the CSAM Scanning Tool on your Cloudflare zone and verify your notification email address.</p></li><li><p><b>Scan and Detect:</b> Our tool scans your cached content for potential CSAM, creating a fuzzy hash of each image. If a match is found against a known bad hash, a detection event is created.</p></li><li><p><b>Remediate:</b> Cloudflare blocks the URL of any identified match and notifies you so that you may take further action.</p></li></ol>
          <figure>
          <img src="https://cf-assets.www.cloudflare.com/zkvhlag99gkb/6cTjykOBheTnzbmcwjKoSI/63fb00a39807897c8b2feda9af373ec0/unnamed.png" />
          </figure>
    <div>
      <h3>What is next?</h3>
      <a href="#what-is-next">
        
      </a>
    </div>
    <p>We believe that the tools for a safer Internet should be available for everyone — not just a few large companies.</p><p>We invite you to enable the CSAM Scanning Tool on your website today. For more technical details on how it works, please visit our <a href="https://developers.cloudflare.com/cache/reference/csam-scanning/"><u>developer documentation</u></a>. We also welcome you to join our community to discuss the technology and help us continue to build a better Internet.</p> ]]></content:encoded>
            <category><![CDATA[Birthday Week]]></category>
            <category><![CDATA[Trust & Safety]]></category>
            <category><![CDATA[Abuse]]></category>
            <category><![CDATA[Legal]]></category>
            <guid isPermaLink="false">4SD2BwOE3yemddmMT25cnO</guid>
            <dc:creator>Rachael Truong</dc:creator>
        </item>
        <item>
            <title><![CDATA[Cloudflare’s 2024 Transparency Reports - now live with new data and a new format]]></title>
            <link>https://blog.cloudflare.com/cloudflare-2024-transparency-reports-now-live-with-new-data-and-a-new-format/</link>
            <pubDate>Fri, 28 Feb 2025 14:00:00 GMT</pubDate>
            <description><![CDATA[ Cloudflare’s 2024 Transparency Reports are now live — with new topics, new data points, and a new format, consistent with the EU’s Digital Services Act ]]></description>
            <content:encoded><![CDATA[ <p>Cloudflare’s 2024 <a href="https://www.cloudflare.com/transparency/"><u>Transparency Reports</u></a> are now live — with new topics, new data points, and a new format. For <a href="https://www.cloudflare.com/transparency/archive/"><u>over 10 years</u></a>, Cloudflare has published transparency reports twice a year to provide information to our customers, policymakers, and the public about how we handle legal requests and abuse reports relating to the websites using our services. Such transparency reporting is now recognized as a <a href="https://www.accessnow.org/campaign/transparency-reporting-index/"><u>best practice</u></a> among companies offering online services, and has even been written into law with the European Union’s Digital Services Act (DSA).</p><p>While Cloudflare has been publishing transparency reports for a long time, this year we chose to revamp the report in light of new reporting obligations under the DSA, and our goal of making our reports both comprehensive and easy to understand. Before you dive into the reports, learn more about Cloudflare’s longstanding commitment to transparency reporting and the key updates we made in this year’s reports.</p>
    <div>
      <h3>Cloudflare’s approach to transparency reporting</h3>
      <a href="#cloudflares-approach-to-transparency-reporting">
        
      </a>
    </div>
    <p>Cloudflare started issuing transparency reports early on, because we have long believed that transparency is essential to earning trust. In addition to sharing data about the number and nature of requests we receive, our transparency reports have provided a forum for Cloudflare to articulate the principles we apply in approaching <a href="https://www.cloudflare.com/trust-hub/law-enforcement/"><u>legal requests for customer information</u></a> and how we <a href="https://www.cloudflare.com/trust-hub/abuse-approach/"><u>handle abuse</u></a>.</p><p>Grounded in Cloudflare’s principles, our transparency reports have necessarily evolved over time as the scale and complexity of our services have grown. While our initial reports were focused on governmental requests for customer information, our reports have expanded to cover a broader set of issues, including civil requests for customer information, legal requests to limit or terminate services, and our process for handling reports of abuse on websites using our services.</p>
          <figure>
          <img src="https://cf-assets.www.cloudflare.com/zkvhlag99gkb/7xcEb5PMZSvbk1Blkh7I1S/1694b584f1223a24d5aedde0065352ae/image2.png" />
          </figure>
    <div>
      <h3>The EU’s Digital Services Act</h3>
      <a href="#the-eus-digital-services-act">
        
      </a>
    </div>
    <p>A key driver of this year’s updates was the transparency reporting obligations in the <a href="https://blog.cloudflare.com/digital-services-act/"><u>EU’s Digital Services Act (DSA)</u></a>. As we have written about <a href="https://blog.cloudflare.com/digital-services-act/"><u>previously</u></a>, the DSA replaced a 20-year-old law called the e-Commerce Directive, providing an important framework for addressing the legal responsibilities of online service providers.</p><p>While the DSA addresses a number of topics, an important one is transparency. The DSA sets different transparency reporting obligations for different services, establishing baseline reporting requirements for all intermediary services, more detailed reporting for hosting services, and the most extensive reporting for online platforms like social media sites and search engines. Most of Cloudflare’s services are pass-through (intermediary) services related to security and performance with limited transparency reporting requirements under the DSA, while our hosting services have some additional requirements related to our abuse-related actions.</p><p>The DSA transparency obligations align with Cloudflare’s longstanding practices and company principles toward transparency. Because Cloudflare has always strived to provide meaningful transparency into its approach to these issues, we are well positioned to comply with the specific reporting obligations set forth in the DSA. That said, while we believe that our existing reports already satisfied much of the DSA, we identified changes we wanted to make to match specific types of data or formatting called for under the DSA. </p>
    <div>
      <h3>New data and a new format</h3>
      <a href="#new-data-and-a-new-format">
        
      </a>
    </div>
    <p>Our 2024 Transparency Reports include more information than ever before, all in a new format that we believe will make the information easier to understand.</p><p>Prompted by the DSA’s requirements and the continued expansion of services we offer, the 2024 reports include new information: additional categories of hosted content abuse, automated steps Cloudflare has taken to mitigate phishing and technical abuse, the mean time to take action on different types of abuse reports, and information about additional types of requests for customer information that we have received. You’ll find a machine-readable version of the data alongside our transparency reports, consistent with DSA requirements. We also introduced "additional context" boxes to call out trends or notable developments during the reporting period.</p><p>To make all of this information as digestible as possible, we divided our transparency report into two parts. Our report on Legal Requests for Information addresses the law enforcement, government, and civil requests for customer information that Cloudflare receives in the United States and around the world. Our report on Abuse Processes addresses Cloudflare’s processes for handling reports of abuse on websites using our services and our response to legal requests to terminate or restrict access to our users.</p><p>Because we divided the report into two parts, you’ll find our ‘<a href="https://blog.cloudflare.com/cloudflare-transparency-update-joining-cloudflares-flock-of-warrant-canaries-2/"><u>warrant canaries</u></a>’ on the <a href="https://www.cloudflare.com/transparency/"><u>transparency report landing page</u></a> of our <a href="https://www.cloudflare.com/trust-hub/"><u>Trust Hub</u></a>, and no longer in the reports themselves. The warrant canary statements about things we have never done as a company are an essential part of our commitment to transparency in how we handle both legal requests for customer information and abuse reports. All of our warrant canaries remain intact, meaning we still haven't done any of these things.</p><p>We’ll continue to publish transparency reports twice a year, available on the <a href="https://www.cloudflare.com/transparency/"><u>Transparency page</u></a> of our website as well as through an <a href="https://www.cloudflare.com/transparency/rss.xml"><u>RSS feed</u></a>. Our approach to these reports will continue to evolve in order to provide meaningful transparency in line with our company principles, the growth of our product portfolio, and the evolving regulatory environment.</p>
            <category><![CDATA[Trust & Safety]]></category>
            <category><![CDATA[Transparency]]></category>
            <category><![CDATA[Policy & Legal]]></category>
            <guid isPermaLink="false">6r04i7Ke1lNGEWK4u3pRK1</guid>
            <dc:creator>Abby Vollmer</dc:creator>
            <dc:creator>Despina Papageorge</dc:creator>
        </item>
        <item>
            <title><![CDATA[First Half 2019 Transparency Report and an Update on a Warrant Canary]]></title>
            <link>https://blog.cloudflare.com/first-half-2019-transparency-report-and-an-update-on-a-warrant-canary/</link>
            <pubDate>Fri, 20 Dec 2019 21:49:36 GMT</pubDate>
            <description><![CDATA[ Today, we are releasing Cloudflare’s transparency report for the first half of 2019. We recognize the importance of keeping the reports current, but it’s taken us a little longer ]]></description>
            <content:encoded><![CDATA[ <p>Today, we are releasing <a href="https://www.cloudflare.com/transparency/">Cloudflare’s transparency report</a> for the first half of 2019. We recognize the importance of keeping the reports current, but it’s taken us a little longer than usual to put it together. We have a few notable updates.</p>
            <figure>
            
            <img src="https://cf-assets.www.cloudflare.com/zkvhlag99gkb/4xY1LkLltSH3mLIdmrOzEJ/d090e2f5d85f1dadc2ddd868242a6d58/canary-1.png" />
            
            </figure>
    <div>
      <h3>Pulling a warrant canary</h3>
      <a href="#pulling-a-warrant-canary">
        
      </a>
    </div>
    <p>Since we issued our very first transparency report in 2014, we’ve maintained a number of commitments - known as warrant canaries - about what actions we will take and how we will respond to certain types of law enforcement requests. We supplemented those initial commitments <a href="/cloudflare-transparency-update-joining-cloudflares-flock-of-warrant-canaries-2/">earlier this year</a>, so that our current warrant canaries state that Cloudflare has never:</p><ol><li><p>Turned over our encryption or authentication keys or our customers' encryption or authentication keys to anyone.</p></li><li><p>Installed any law enforcement software or equipment anywhere on our network.</p></li><li><p>Terminated a customer or taken down content due to political pressure*</p></li><li><p>Provided any law enforcement organization a feed of our customers' content transiting our network.</p></li><li><p>Modified customer content at the request of law enforcement or another third party.</p></li><li><p>Modified the intended destination of DNS responses at the request of law enforcement or another third party.</p></li><li><p>Weakened, compromised, or subverted any of its encryption at the request of law enforcement or another third party.</p></li></ol><p>These commitments serve as a statement of values to remind us what is important to us as a company, to convey not only what we do, but what we believe we should do. For us to maintain these commitments, we have to believe not only that we’ve met them in the past, but that we can continue to meet them.</p><p>Unfortunately, there is one warrant canary that no longer meets the test for remaining on our website. After Cloudflare terminated the Daily Stormer’s service in 2017, Matthew <a href="/why-we-terminated-daily-stormer/">observed</a>:</p><p><i>"We're going to have a long debate internally about whether we need to remove the bullet about not terminating a customer due to political pressure. 
It's powerful to be able to say you've never done something. And, after today, make no mistake, it will be a little bit harder for us to argue against a government somewhere pressuring us into taking down a site they don't like."</i></p><p>We addressed this issue in our subsequent transparency reports by retaining the statement, but adding an asterisk identifying the Daily Stormer debate and the criticism that we had received in the wake of our decision to terminate services. Our goal was to signal that we remained committed to the principle that we should not terminate a customer due to political pressure, while not ignoring the termination. We also sought to be public about the termination and our reasons for the decision, ensuring that it would not go unnoticed.</p><p>Although that termination sparked significant debate about whether infrastructure companies should be making decisions about what content remains online, we haven’t yet seen politically accountable actors put forth real alternatives to address deeply troubling content and behavior online. Since that time, we’ve seen even more real-world consequences from the vitriol and hateful content spread online, from the screeds posted in connection with the terror attacks in Christchurch, Poway, and El Paso to the posting of video glorifying those attacks. Indeed, in the absence of true public policy initiatives to address those concerns, the pressure on tech companies -- even deep Internet infrastructure companies like Cloudflare -- to make judgments about what stays online has only increased.</p><p>In August 2019, Cloudflare terminated service to 8chan based on its failure to moderate its hate-filled platform in a way that inspired murderous acts. 
Although we don’t think removing cybersecurity services to force a site offline is the right public policy approach to the hate festering online, a site’s failure to take responsibility to prevent or mitigate the harm caused by its platform leaves service providers like us with few choices. We’ve come to recognize that the prolonged and persistent lawlessness of others might require action by those further down the technical stack. Although we’d prefer that governments recognize that need, and build mechanisms for due process, if they fail to act, infrastructure companies may be required to take action to prevent harm.</p><p>And that brings us back to our warrant canary. If we believe we might have an obligation to terminate customers, even in a limited number of cases, retaining a commitment that we will never terminate a customer “due to political pressure” is untenable. We could, in theory, argue that terminating a lawless customer like 8chan was not a termination “due to political pressure.” But that seems wrong. We shouldn’t be parsing specific words of our commitments to explain to people why we don’t believe we’ve violated the standard.</p><p>We remain committed to the principle that providing cybersecurity services to everyone, regardless of content, makes the Internet a better place. Although we’re removing the warrant canary from our website, we believe that to earn and maintain our users’ trust, we must be transparent about the actions we take. We therefore commit to reporting on any action that we take to terminate a user that could be viewed as a termination “due to political pressure.”</p>
    <div>
      <h3>UK/US Cloud agreement</h3>
      <a href="#uk-us-cloud-agreement">
        
      </a>
    </div>
    <p>As we’ve described <a href="/digital-evidence-across-borders-and-engagement-with-non-us-authorities/">previously</a>, governments have been working to find ways to improve law enforcement access to digital evidence across borders. Those efforts resulted in a new U.S. law, the Clarifying Lawful Overseas Use of Data (CLOUD) Act, premised on the idea that law enforcement around the world should be able to get access to electronic content related to their citizens when conducting law enforcement investigations, wherever that data is stored, as long as they are bound by sufficient procedural safeguards to ensure due process.</p><p>On October 3, 2019, the US and UK signed the first Executive Agreement under this law. According to the requirements of U.S. law, that Agreement will go into effect in 180 days, in March 2020, unless Congress takes action to block it. There is an ongoing debate as to whether the agreement includes sufficient due process and privacy protections. We’re going to take a wait-and-see approach, and will closely monitor any requests we receive after the agreement goes into effect.</p><p>For the time being, Cloudflare intends to comply with appropriately scoped and targeted requests for data from UK law enforcement, provided that those requests are consistent with the law and international human rights standards. Information about the legal requests that Cloudflare receives from non-U.S. governments pursuant to the CLOUD Act will be included in future transparency reports.</p>
            <category><![CDATA[Policy & Legal]]></category>
            <category><![CDATA[Trust & Safety]]></category>
            <category><![CDATA[Transparency]]></category>
            <category><![CDATA[Policy & Legal]]></category>
            <guid isPermaLink="false">26p8e8McNC9PBOC8HjH5ql</guid>
            <dc:creator>Alissa Starzak</dc:creator>
            <dc:creator>Justin Paine</dc:creator>
        </item>
        <item>
            <title><![CDATA[Cloudflare’s Response to CSAM Online]]></title>
            <link>https://blog.cloudflare.com/cloudflares-response-to-csam-online/</link>
            <pubDate>Fri, 06 Dec 2019 14:06:00 GMT</pubDate>
            <description><![CDATA[ Responding to incidents of child sexual abuse material (CSAM) online has been a priority at Cloudflare from the beginning. The stories of CSAM victims are tragic, and bring to light an appalling corner of the Internet.  ]]></description>
            <content:encoded><![CDATA[ <p>Responding to incidents of child sexual abuse material (CSAM) online has been a priority at Cloudflare from the beginning. The stories of CSAM victims are tragic, and bring to light an appalling corner of the Internet. When it comes to CSAM, our position is simple: We don’t tolerate it. We abhor it. It’s a crime, and we do what we can to support the processes to identify and remove that content.</p><p>In 2010, within months of Cloudflare’s launch, we connected with the <a href="http://www.missingkids.com/">National Center for Missing and Exploited Children</a> (NCMEC) and started a collaborative process to understand our role and how we could cooperate with them. Over the years, we have been in regular communication with a number of government and advocacy groups to determine what Cloudflare should and can do to respond to reports about CSAM that we receive through our abuse process, or how we can provide information supporting investigations of websites using Cloudflare’s services.</p><p>Recently, <a href="https://twitter.com/mhkeller/status/1196818679683530752">36 tech companies</a>, including Cloudflare, received <a href="https://storage.googleapis.com/blog-cloudflare-com-assets/2019/12/senatorletter.pdf">this letter</a> from a group of U.S. Senators asking for more information about how we handle CSAM content. The Senators referred to influential New York Times stories published in late September and early November that conveyed the disturbing number of images of child sexual abuse on the Internet, with graphic detail about the horrific photos and how the recirculation of imagery retraumatizes the victims. 
The stories focused on shortcomings and challenges in bringing violators to justice, as well as efforts, or lack thereof, by a group of tech companies including Amazon, Facebook, Google, Microsoft, and Dropbox, to eradicate as much of this material as possible through existing processes or new tools like PhotoDNA that could proactively identify CSAM material.  </p><p>We think it is important to share our response to the Senators (copied at the end of this blog post), talk publicly about what we’ve done in this space, and address what else we believe can be done.</p>
    <div>
      <h2>How Cloudflare Responds to CSAM</h2>
      <a href="#how-cloudflare-responds-to-csam">
        
      </a>
    </div>
    <p>From our work with NCMEC, we know that they are focused on doing everything they can to validate the legitimacy of CSAM reports and then work as quickly as possible to have website operators, platform moderators, or website hosts remove that content from the Internet. Even though Cloudflare is not in a position to remove content from the Internet for users of our core services, we have worked continually over the years to understand the best ways we can contribute to these efforts.</p>
    <div>
      <h3>Addressing Reports</h3>
      <a href="#addressing-reports">
        
      </a>
    </div>
    <p>The first prong of Cloudflare’s response to CSAM is proper reporting of any allegation we receive. Every report we receive about content on a website using Cloudflare’s services filed under the “child pornography” category on our <a href="https://www.cloudflare.com/abuse/">abuse report page</a> leads to three actions:</p><ol><li><p>We forward the report to NCMEC. In addition to the content of the report made to Cloudflare, we provide NCMEC with information identifying the hosting provider of the website, contact information for that hosting provider, and the origin IP address where the content at issue can be located.</p></li><li><p>We forward the report to both the website operator and hosting provider so they can take steps to remove the content, and we provide the origin IP of where the content is located on the system so they can locate the content quickly. (Since 2017, we have given reporting parties the opportunity to file an anonymous report if they would prefer that either the host or the website operator not be informed of their identity).</p></li><li><p>We provide anyone who makes a report information about the identity of the hosting provider and contact information for the hosting provider in case they want to follow up directly.</p></li></ol><p>Since our founding, Cloudflare has forwarded 5,208 reports to NCMEC. Over the last three years, we have provided 1,111 reports in 2019 (to date), 1,417 in 2018, and 627 in 2017.  </p><p>Reports filed under the “child pornography” category account for about 0.2% of the abuse complaints Cloudflare receives. These reports are treated as the highest priority for our Trust &amp; Safety team and they are moved to the front of the abuse response queue. We are generally able to respond by filing the report with NCMEC and providing the additional information within a matter of minutes regardless of time of day or day of the week.</p>
            <figure>
            
            <img src="https://cf-assets.www.cloudflare.com/zkvhlag99gkb/5gldDA1mBLXQkHbABJIv9e/71dbbefd701e70d89f2a98630ee47f6d/form-fill-report_2x.png" />
            
            </figure>
    <div>
      <h3>Requests for Information</h3>
      <a href="#requests-for-information">
        
      </a>
    </div>
    <p>The second main prong of our response to CSAM is operation of our “trusted  reporter” program to provide relevant information to support the investigations of nearly 60 child safety organizations around the world. The "trusted reporter" program was established in response to our ongoing work with these organizations and their requests for both information about the hosting provider of the websites at issue as well as information about the origin IP address of the content at issue. Origin IP information, which is generally sensitive security information because it would allow hackers to circumvent certain security protections for a website, like DDoS protections, is provided to these organizations through dedicated channels on an expedited basis.</p><p>Like NCMEC, these organizations are responsible for investigating reports of CSAM on websites or hosting providers operated out of their local jurisdictions, and they seek the resources to identify and contact those parties as quickly as possible to have them remove the content. Participants in the “trusted reporter” program include groups like the <a href="https://www.iwf.org.uk/">Internet Watch Foundation</a> (IWF), the <a href="https://www.inhope.org/">INHOPE Association</a>, the <a href="https://www.esafety.gov.au/">Australian eSafety Commission</a>, and <a href="https://www.meldpunt-kinderporno.nl/">Meldpunt</a>. Over the past five years, we have responded to more than 13,000 IWF requests, and more than 5,000 requests from Meldpunt. We respond to such requests on the same day, and usually within a couple of hours. In a similar way, Cloudflare also receives and responds to law enforcement requests for information as part of investigations related to CSAM or exploitation of a minor.</p>
            <figure>
            
            <img src="https://cf-assets.www.cloudflare.com/zkvhlag99gkb/7waacDQO5qvlvRoG95aGeP/5460dea950b94dfde9ebab57e74f5f50/trusted-reporter_2x.png" />
            
            </figure><p>Among this group, the Canadian Centre for Child Protection has been engaged in a unique effort that is worthy of specific mention. The Centre’s <a href="https://www.cybertip.ca/app/en/">Cybertip</a> program operates their Project Arachnid initiative, a novel approach that employs an automated web crawler that proactively searches the Internet to identify images that match a known CSAM hash, and then alerts hosting providers when there is a match. Based on our ongoing work with Project Arachnid, we have responded to more than 86,000 reports by providing information about the hosting provider and the origin IP address, which we understand they use to contact that hosting provider directly with that report and any subsequent reports.</p><p>Although we typically process these reports within a matter of hours, we’ve heard from participants in our “trusted reporter” program that the non-instantaneous response from us causes friction in their systems. They want to be able to query our systems directly to get the hosting provider and origin IP information, or better, to build extensions on their automated systems that could interface with the data in our systems to remove any delay whatsoever. This is particularly relevant for participants in the Canadian Centre’s Project Arachnid, who want to make our information a part of their automated system. After scoping out this solution for a while, we’re now confident that we have a way forward, and in November we informed some trusted reporters that we will be making available an API that will allow them to obtain instantaneous information in response to their requests pursuant to their investigations. We expect this functionality to be online in the first quarter of 2020.</p>
    <div>
      <h3>Termination of Services</h3>
      <a href="#termination-of-services">
        
      </a>
    </div>
    <p>Cloudflare takes steps in appropriate circumstances to terminate its services to a site when it becomes clear that the site is dedicated to sharing CSAM, or when the operators of the website and its host fail to take appropriate steps to take down CSAM content. In most circumstances, CSAM reports involve individual images that are posted on user-generated content sites and are removed quickly by responsible website operators or hosting providers. In other circumstances, when operators or hosts fail to take action, Cloudflare is unable on its own to delete or remove the content, but will take steps to terminate services to the website. We follow up on reports from NCMEC or other organizations when they report to us that they have completed their initial investigation and confirmed the legitimacy of the complaint, but have not been able to have the website operator or host take down the content. We also work with Interpol to identify sites that they have determined have not taken steps to address CSAM, and to discontinue services to those sites.</p><p>Based upon these determinations and interactions, we have terminated service to 5,428 domains over the past 8 years.</p><p>In addition, Cloudflare has introduced new products, including Cloudflare Stream and Cloudflare Workers, where we do serve as the host of content and would be in a position to remove it from the Internet. Although these products have limited adoption to date, we expect their utilization will increase significantly over the next few years. Therefore, we will be conducting scans of the content that we host for users of these products using PhotoDNA (or similar tools) that make use of NCMEC’s image hash list. If content is flagged, we will remove it immediately. We are working on that functionality now, and expect it will be in place in the first half of 2020.</p>
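<p>At a high level, that scanning flow is a hash-membership check at upload time. The sketch below is a simplified illustration: SHA-256 stands in for a perceptual hash like PhotoDNA (which also matches visually similar copies of a known image after resizing or re-encoding), and the hash list is a made-up placeholder rather than NCMEC data.</p>

```python
# Simplified sketch of upload-time hash scanning. SHA-256 stands in for a
# perceptual hash such as PhotoDNA, and the hash list below is a made-up
# placeholder, not NCMEC data.
import hashlib

# A real deployment would populate this set from NCMEC's image hash list.
KNOWN_BAD_HASHES = {
    hashlib.sha256(b"placeholder-known-bad-file").hexdigest(),
}


def hash_file(file_bytes: bytes) -> str:
    """Compute the digest used for the membership check."""
    return hashlib.sha256(file_bytes).hexdigest()


def should_remove(file_bytes: bytes) -> bool:
    """Flag an upload for immediate removal if it matches a known hash."""
    return hash_file(file_bytes) in KNOWN_BAD_HASHES
```

<p>A flagged upload would then be removed and reported through the CyberTipline flow described earlier in this post.</p>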
    <div>
      <h2>Part of an Organized Approach to Addressing CSAM</h2>
      <a href="#part-of-an-organized-approach-to-addressing-csam">
        
      </a>
    </div>
    <p>Cloudflare’s approach to addressing CSAM operates within a comprehensive legal and policy backdrop. Congress and the law enforcement and child protection communities have long collaborated on how best to combat the exploitation of children. Recognizing the importance of combating the online spread of CSAM, NCMEC first created the <a href="http://www.missingkids.org/gethelpnow/cybertipline">CyberTipline</a> in 1998, to provide a centralized reporting system for members of the public and online providers to report the exploitation of children online.</p><p>In 2006, Congress conducted a year-long <a href="https://www.govinfo.gov/content/pkg/CPRT-109HPRT31737/html/CPRT-109HPRT31737.htm">investigation</a> and then passed a number of laws to address the sexual abuse of children. Those laws attempted to calibrate the various interests at stake and coordinate the ways various parties should respond. The policy balance Congress struck on addressing CSAM on the Internet had a number of elements for online service providers.</p><p>First, Congress formalized NCMEC’s role as the central clearinghouse for reporting and investigation, through the CyberTipline. The law added a <a href="https://uscode.house.gov/view.xhtml?req=18+USC+2258A&amp;f=treesort&amp;fq=true&amp;num=3&amp;hl=true&amp;edition=prelim&amp;granuleId=USC-prelim-title18-section2258A">requirement</a>, backed by fines, for online providers to report any apparent CSAM to NCMEC. The law specifically notes that, to preserve privacy, it does not create a requirement to monitor content or affirmatively search or screen content to identify possible reports.</p><p>Second, Congress responded to the many stories of child victims who emphasized the continuous harm done by the transmission of imagery of their abuse. 
As described by <a href="http://www.missingkids.com/theissues/sexualabuseimagery">NCMEC</a>, “not only do these images and videos document victims’ exploitation and abuse, but when these files are shared across the internet, child victims suffer re-victimization each time the image of their sexual abuse is viewed” even when viewed for ostensibly legitimate investigative purposes. To help address this concern, the law <a href="https://uscode.house.gov/view.xhtml?hl=false&amp;edition=prelim&amp;path=&amp;req=granuleid%3AUSC-prelim-title18-section2258B&amp;f=treesort&amp;fq=true&amp;num=0&amp;saved=%7CMTggVVNDIDIyNThB%7CdHJlZXNvcnQ%3D%7CdHJ1ZQ%3D%3D%7C3%7Ctrue%7Cprelim">directs</a> providers to minimize the number of employees provided access to any visual depiction of child sexual abuse.  </p><p>Finally, to ensure that child safety and law enforcement organizations had the records necessary to conduct an investigation, the law <a href="https://uscode.house.gov/view.xhtml?hl=false&amp;edition=prelim&amp;path=&amp;req=granuleid%3AUSC-prelim-title18-section2258A&amp;f=treesort&amp;fq=true&amp;num=0&amp;saved=%7CMTggVVNDIDIyNThB%7CdHJlZXNvcnQ%3D%7CdHJ1ZQ%3D%3D%7C3%7Ctrue%7Cprelim">directs</a> providers to preserve not only the report to NCMEC, but also “any visual depictions, data, or other digital files that are reasonably accessible and may provide context or additional information about the reported material or person” for a period of 90 days.</p>
            <figure>
            
            <img src="https://cf-assets.www.cloudflare.com/zkvhlag99gkb/3NhDSyBzr2XtU8bm1NSCuL/8dd8e1d02dd9d3c3f04ebf088d25676e/stats-_2x.png" />
            
            </figure><p>Because Cloudflare’s services are used so extensively (by more than 20 million Internet properties and, based on <a href="https://w3techs.com/technologies/history_overview/proxy/all/q">data from W3Techs</a>, more than 10% of the world’s top 10 million websites), we have worked hard to understand these policy principles in order to respond appropriately in a broad variety of circumstances. The processes described in this blog post were designed to make sure that we comply with these principles, as completely and quickly as possible, and take other steps to support the system’s underlying goals.</p>
    <div>
      <h2>Conclusion</h2>
      <a href="#conclusion">
        
      </a>
    </div>
    <p>We are under no illusion that our work in this space is done. We will continue to work with groups that are dedicated to fighting this abhorrent crime, and to provide tools that get them information more quickly so they can take CSAM content down and investigate the criminals who create and distribute it.</p><p><a href="https://storage.googleapis.com/blog-cloudflare-com-assets/2019/12/cloudflareresponse.pdf"><b>Cloudflare's Senate Response (PDF)</b></a></p> ]]></content:encoded>
            <category><![CDATA[Policy & Legal]]></category>
            <category><![CDATA[Community]]></category>
            <category><![CDATA[Trust & Safety]]></category>
            <guid isPermaLink="false">5Not63XGszOE0baXeqaEzN</guid>
            <dc:creator>Doug Kramer</dc:creator>
            <dc:creator>Justin Paine</dc:creator>
        </item>
        <item>
            <title><![CDATA[Cloudflare Transparency Update: Joining Cloudflare’s Flock of (Warrant) Canaries]]></title>
            <link>https://blog.cloudflare.com/cloudflare-transparency-update-joining-cloudflares-flock-of-warrant-canaries-2/</link>
            <pubDate>Mon, 25 Feb 2019 14:00:00 GMT</pubDate>
            <description><![CDATA[ Today, Cloudflare is releasing its transparency report for the second half of 2018. We have been publishing biannual Transparency Reports since 2013. ]]></description>
            <content:encoded><![CDATA[ <p>Today, Cloudflare is releasing its <a href="https://www.cloudflare.com/transparency/updates/">transparency report</a> for the second half of 2018. We have been <a href="https://www.cloudflare.com/transparency/">publishing</a> biannual Transparency Reports since 2013.</p><p>We believe an essential part of earning the trust of our customers is being transparent about our features and services, what we do – and do not do – with our users’ data, and generally how we conduct ourselves in our engagement with third parties such as law enforcement authorities. We also think that an important part of being fully transparent is being rigorously consistent and anticipating future circumstances, so our users not only know how we have behaved in the past, but are able to anticipate with reasonable certainty how we will act in the future, even in difficult cases.</p><p>As part of that effort, we have set forth certain ‘warrant canaries’ – statements of things we have never done as a company. As described in greater detail below, the report published today adds three new ‘warrant canaries’, the first additions to that list since 2013. This transparency report also adds new reporting on requests for user information from foreign law enforcement, and on requests for user information that we receive from government agencies that are not part of law enforcement.</p><p>This is the first in a series of blog posts this week that will describe our process and the commitments we make in relation to the handling of user data and abuse queries, our interactions with the law enforcement and security communities, and our essential red lines when it comes to how we operate as a company. 
The specific updates will include:</p><ul><li><p>Monday: This blog post on the updated transparency report and new warrant canaries.</p></li><li><p>Tuesday: An updated discussion of how we address requests for content moderation.</p></li><li><p>Wednesday: How we plan to deal with abuse of new products.</p></li><li><p>Thursday: Dealing with requests from non-US law enforcement.</p></li></ul><p>This is an exciting time of growth for Cloudflare, and we are only just getting started, so we do expect more complexity over the years. The fundamentals, however, remain constant for us: transparency, due process, openness, integrity, and a commitment to improving the Internet for all. We are excited to share more with you this week!</p>
    <div>
      <h3>New Warrant Canaries</h3>
      <a href="#new-warrant-canaries">
        
      </a>
    </div>
    <p>From the beginning, and consistent with our mission of “helping build a better Internet,” Cloudflare has relied on a set of values that inform how we work with our customers, with law enforcement, and with other third parties. Maintaining the privacy and trust of our users and supporting a secure, well-functioning, and content-neutral Internet is essential to us.</p><p>It’s not enough for us to be transparent about the things we do willingly. Tech companies are pressured every day to take the easy way out, avoiding controversy or conflict by quietly taking seemingly small actions that are corrosive to these values. So, for many years, we have published a list of “things we have never done” in our transparency report to demonstrate our commitment to these values.</p><p>The rationale behind including “warrant canaries” in our transparency report is twofold. First, if Cloudflare is asked by law enforcement or a third party to act against one of the warrant canaries and not disclose it publicly, we will still have to remove it from our list. The removal of the warrant canary, like the silence of a canary in the coal mine, will signal to our customers that something is not right. Second, these statements serve as a signal to groups that may ask us to take actions contravening our values that such actions are not so easy for us to take. We have said before and re-commit here: if Cloudflare were asked to take an action violating one of the warrant canaries, we would pursue legal remedies challenging the request in order to protect our customers from what we believe are improper, illegal, or unconstitutional requests.</p>
            <figure>
            
            <img src="https://cf-assets.www.cloudflare.com/zkvhlag99gkb/2xOkIYGjQYv3DaGruYxMAS/17c2644547861ee34c7a4840c1514f68/canary-1.png" />
            
            </figure>
    <div>
      <h3>Why add new warrant canaries?</h3>
      <a href="#why-add-new-warrant-canaries">
        
      </a>
    </div>
    <p>We have not added warrant canaries since we put out our first transparency report in 2013. The original canaries are as follows:</p><ul><li><p>Cloudflare has never turned over our SSL keys or our customers' SSL keys to anyone.</p></li><li><p>Cloudflare has never installed any law enforcement software or equipment anywhere on our network.</p></li><li><p>Cloudflare has never terminated a customer or taken down content due to political pressure.</p></li><li><p>Cloudflare has never provided any law enforcement organization a feed of our customers' content transiting our network.</p></li></ul><p>So, why change that this year? Though the company develops new products each year, the addition of new types of services in 2018, notably Cloudflare Workers and DNS Resolver 1.1.1.1, expanded our capabilities in a way that we believe is worth addressing. Similarly, regulation of technology has been changing globally, and we feel it is pertinent to respond to these developments.</p><p>The new canaries, and the issues they are intended to address, are outlined below. To be clear, we have not necessarily received law enforcement requests to do any of these things at this point. We just want to make sure we lay out our commitments as clearly as possible before we get a request.</p>
    <div>
      <h3>The new canaries</h3>
      <a href="#the-new-canaries">
        
      </a>
    </div>
    <p><b>Cloudflare has never modified customer content at the request of law enforcement or another third party.</b></p><p>The Internet has come a long way since the early days, when every visitor to a website saw precisely the same content. Cookies and other techniques allow developers to customize the user experience. In the last year and a half, Cloudflare launched Workers, which allows website developers to customize their websites using edge-side code. Using Workers, our customers can serve different versions of their website to different types of visitors or to visitors in different locations. Although being able to alter the version of a website that particular visitors see, or the application that runs for different visitors, is a powerful new tool for our customers, we recognize that it also holds the potential for mischief and abuse. Governments or malicious actors could in theory use edge-side code to modify the content of a website, make changes only for particular viewers, or collect information about the visitors to a site.</p><p>We believe that only those who are authorized to change the site itself should be able to make changes by running code at the edge. We will therefore fight requests to make modifications, either by adding apps or modifying content, at the request of a third party without the customer’s consent.</p><p><b>Cloudflare has never modified the intended destination of DNS responses at the request of law enforcement or another third party.</b></p><p>The privacy and security of DNS Resolver 1.1.1.1 are very important to us, and were front of mind when designing the service, as described <a href="/announcing-1111/">here</a>. At Cloudflare, we believe that part of helping to build a better Internet is ensuring that users are routed to the website they intend to visit.</p><p>DNS spoofing, or cache poisoning, exploits the functioning of DNS resolvers in order to route unsuspecting visitors incorrectly. 
If we think of DNS as the phonebook of the Internet, DNS spoofing is similar to someone taking new phonebooks from people’s doors and replacing them with fakes. In this new copy, the attacker has changed ordinary people’s numbers to the numbers of phone scammers. When a user with one of the affected books looks up and calls the number of, say, a landscaping service, or even a friend, they end up dialing a scammer instead. In DNS spoofing, a person looking up an affected website would be directed to a fake website, or somewhere different entirely, rather than the intended destination.</p><p>We saw a concrete example of this type of DNS spoofing earlier this month. On February 10, 2019, Venezuelan opposition leader Juan Guaido asked Venezuelans to volunteer to help international humanitarian organizations deliver aid into the country. A day after this public announcement, however, a similarly named website was set up, and users in Venezuela trying to visit the original and official website were redirected, using DNS spoofing, to the fake website. The fake website had a form to register personal data, such as name, email address, and cell phone number.</p><p>According to <a href="https://motherboard.vice.com/en_us/article/d3mdxm/venezuela-government-hack-activists-phishing">Motherboard</a>:</p><blockquote><p>While studying the fake website, researchers found phishing sites hosted on the same IP address. 
And there’s evidence that the people behind the second, apparently fake and malicious, website were working for the <a href="https://www.nytimes.com/2019/01/23/world/americas/venezuela-protests-guaido-maduro.html"><b>government</b></a> of Maduro, according to security firm CrowdStrike and independent researchers.</p></blockquote><blockquote><p>“It’s clearly the work of the Venezuelan government trying to identify the people working against them, so that they can put a stop to it,” Adam Meyers, the vice president of intelligence at CrowdStrike, a firm that’s analyzed the attacks, told Motherboard in a phone call.</p></blockquote><p>This type of DNS spoofing can be done for any number of purposes, from gaining sensitive information to preventing access to websites with controversial content. Making a commitment not to modify the intended destination of DNS responses at the request of law enforcement or a third party is an affirmation of our desire to ensure the reliability of 1.1.1.1 and do our best to maintain confidence in the DNS and Internet infrastructure more generally.</p><p>Occasionally, law enforcement uses Cloudflare for domains they have seized from <a href="https://www.cloudflare.com/learning/dns/glossary/what-is-a-domain-name-registrar/">domain registrars</a> using legal process. Because law enforcement has obtained legal control of the website in those circumstances (through seizure), that service does not involve modification of DNS responses.</p><p><b>Cloudflare has never weakened, compromised, or subverted any of its encryption at the request of law enforcement or another third party.</b></p><p>We believe encryption is critical to a trustworthy and secure Internet. 
Encryption prevents the theft of private data, making it safer to bank, shop, and communicate online.</p><p>Because of the importance of encryption to the Internet ecosystem, we have a team constantly working on new ways to increase encryption on the Internet, whether that means providing <a href="https://www.cloudflare.com/application-services/products/ssl/">SSL certificates for free</a> to all our users, <a href="/esni/">pioneering eSNI</a>, or supporting <a href="/dns-resolver-1-1-1-1/">DNS over TLS and DNS over HTTPS</a> on 1.1.1.1.</p><p>Because encryption can complicate efforts to obtain access to digital evidence, however, law enforcement agencies have pushed for tools to gain access to encrypted material. These efforts range from the FBI’s attempt to get a court order to require Apple to assist them in obtaining encrypted data from an iPhone in February 2016, to Australia’s new Assistance and Access law, passed last fall. We’re concerned that these types of efforts will raise questions about the security of encryption products. As one Cloudflare employee put it after Australia’s law passed, “tech companies now have to do code reviews of everything coming out of Australia” to ensure there are no vulnerabilities.</p><p>We added the new commitment to prevent this uncertainty. Our intent is to continue focusing on ways to improve current encryption methods and deployment of these methods, not weaken them.</p><p><b>Cloudflare has never turned over our encryption or authentication keys or our customers' encryption or authentication keys to anyone.</b></p><p>This is a slight modification to a previous commitment. 
The wording previously referred to “SSL keys” rather than “encryption and authentication keys.” Given the deprecation of SSL, we wanted to be absolutely clear that we were referring to all encryption and authentication keys, not just those from a deprecated security protocol.</p><p>Our goal in modifying this canary is to provide additional security for our customers. We therefore believe it makes sense to distill the language to encompass the crux of what we will not do, which is provide our customers’ keys to third parties.</p> ]]></content:encoded>
            <category><![CDATA[Transparency]]></category>
            <category><![CDATA[Trust & Safety]]></category>
            <category><![CDATA[Policy & Legal]]></category>
            <guid isPermaLink="false">1fwUBKWTTfPKSqz9W3e3kR</guid>
            <dc:creator>Alissa Starzak</dc:creator>
            <dc:creator>Justin Paine</dc:creator>
            <dc:creator>Erin Walk</dc:creator>
        </item>
    </channel>
</rss>