Here is your question: Is it more important for companies to be good at Data Governance or Data Management? Why?
The U.S. Needs a New Paradigm for Data Governance
by Maya Uppaluru
Harvard Business Review, April 16, 2018

The U.S. Senate and House hearings last week on Facebook's use of data and foreign interference in the U.S. election raised important challenges concerning data privacy, security, ethics, transparency, and responsibility. They also illuminated what could become a vast chasm between traditional privacy and security laws and regulations and rapidly evolving internet-related business models and activities. To help close this gap, technologists need to seriously reevaluate their relationship with government. Here are four ways to start.

Help to increase tech literacy in Washington.

Lawmakers expressed surprise and confusion about Facebook's business model, including how the company generates revenue and uses data for targeted advertising. They also seemed to misunderstand how Facebook functions as a platform for third-party applications and how user data flows among users, Facebook, and third parties. This lack of knowledge — despite the millions of dollars the tech industry spent on lobbying Washington in 2017 — shows that technology literacy among lawmakers still needs to improve. That matters all the more because Washington is clearly interested in enhanced regulation of the data economy. Multiple lawmakers suggested more expansive federal privacy legislation, and there are state efforts as well, such as the California Consumer Privacy Act.

The question is whether regulators will be able to create rules that reflect modern business models and data flows while supporting innovative services and products. To ensure that they do, internet and technology companies must engage with legislators in a different way, beyond the transactional and fleeting. Those who care about the future of technology must be more strategic and focus on educating the federal government and changing its culture over the long term. A few examples of how to embed tech expertise into policy making are TechCongress, the U.S. Digital Service, and the Presidential Innovation Fellows. These programs offer an extended opportunity for people with different experiences and backgrounds to work together to improve how government works — increasing their exposure to different ideas, forcing collaboration, and working through tension and conflicting viewpoints. And when those technologists finish their tours of duty and return to industry, they will bring back a deeper understanding of public service.

Create and enforce stronger policies for governing third parties' use of data.

Much of the internet economy runs on application programming interfaces (APIs). Companies such as Facebook, Google, Apple, Amazon, and Salesforce have robust developer programs that encourage third-party integration with their platforms via APIs. These partnerships make it possible to offer customers valuable services — for example, add-on applications that let you customize your Gmail experience. However, these partnerships can also lead to data spreading to places far beyond users' awareness or control.
As many commenters have pointed out, the Facebook and Cambridge Analytica story is about how these platforms, in partnership with third-party applications, can use and misuse our data in ways many of us did not know were possible. For industry — and users — to realize the full benefit of these types of collaborations, information access by third parties must be accompanied by strong API policies and transparent business practices. This includes responsible vetting of third-party applications, clear policies on what third parties can and cannot do with user data, and dedicated resources and processes for monitoring and enforcing those policies. Finally, companies must be proactive in notifying consumers when violations occur and in taking timely action when they do. Mark Zuckerberg, Facebook's CEO, admitted last week that his team knew about the improper use of 87 million people's data as far back as December 2015, yet did not disclose this to its users. Timely notification is critically important to establishing trust with users.

Invest in user-centered models for consent and terms of service.

Several lawmakers questioned the ability of Facebook's users to truly read and comprehend the company's terms of service. It's a big problem that extends well beyond Facebook. Most consumer internet services and products are designed in ways that encourage consumers to quickly click through long terms of service and legal policies in order to move on and use the app or website they are trying to access. A new consent model is needed for the data economy, one designed around the user's experience that actually helps people understand how their data is being used. This means less legalese and clearer, simpler language and design, tested and shown to convey meaning with the same care that goes into a website's primary services, so that an "opt in" is truly meaningful rather than just another checkbox.

Some pioneering efforts in this area have been made to improve electronic consent in the clinical research space. One example is using images in consent forms, which can slow down a reader's eye and focus their attention. Another is allowing users to select the level of detail they want for any given consent provision, so they can learn more about the topics they care about most.

Future federal policy will likely focus on stronger consent regulations. The Federal Trade Commission has pushed for "just-in-time" disclosures to consumers to obtain their affirmative express consent, and the European Union's General Data Protection Regulation (GDPR) places great importance on consent. Yet usability and user-friendly software design are difficult to mandate by legislation. That should be the responsibility of tech companies, which already have robust user-centered design teams and clearly know how to make their services engaging (and possibly even addictive).

Include data ethics as a central component of any regulatory reform.
As Zuckerberg said on April 10, he considers Facebook to be a content-neutral platform for users to share ideas and opinions freely. Yet many lawmakers expressed concerns over extremist views on the site, such as hate speech and terrorist propaganda, and over the proliferation of false information meant to influence elections around the world. And while Zuckerberg frequently described his company's efforts to use artificial intelligence (AI) to monitor Facebook's content and purge material that violates Facebook's policies, it is also true that the use of AI can, whether intentionally or not, perpetuate or exacerbate biases.

Given that future hearings and legislative and regulatory activity may center on regulating the algorithms that power AI, the data science community and policy makers must work together to identify the principles and rules of the road for addressing this continuously evolving challenge. One great example of this ongoing work is a collaboration between Data for Democracy and Bloomberg on a data science code of ethics.

The congressional hearings represent a fascinating milestone in the evolution of the tech industry and its relationship with regulation. While Facebook is a unique case study, the challenges and opportunities amplified in the hearings are pervasive across the digital economy. This includes an entire sector of data brokers who move more and more consumer data every day and are rarely in the public eye the way Facebook was last week. It seems inevitable that the federal government will enact stronger privacy and consent safeguards, as the European Union has with GDPR. Instead of reacting to or resisting such efforts, tech companies must proactively work with governments and acknowledge that their ever-greater power brings greater responsibility.

Maya Uppaluru is an associate in Crowell & Moring's Washington, D.C., office, where she is a member of the Digital Health Practice and Health Care Group. She previously was a policy advisor in the White House Office of Science and Technology Policy and served in policy and innovation roles at the Office of the National Coordinator for Health IT and the Federal Communications Commission.
Designing data governance that delivers value
Follow these principles to shift from a data-governance model of loosely followed guidelines to one that makes the most of digital and analytics.
by Bryan Petzold, Matthias Roggendorf, Kayvaun Rowshankish, and Christoph Sporleder
June 2020

Executives in every industry know that data is important. Without it, there can be no digital transformation to propel the organization past competitors. There are no analytics driving new sources of revenue. Even running the basic business well isn't possible. But for data to fuel these initiatives, it must be readily available, of high quality, and relevant. Good data governance ensures data has these attributes, which enable it to create value.

The problem is that most governance programs today are ineffective. The issue frequently starts at the top, with a C-suite that doesn't recognize the value-creation potential of data governance. As a result, governance becomes a set of policies and guidance relegated to a support function executed by IT and not widely followed—rendering the initiatives that data powers equally ineffective. In other cases, organizations try to use technology to solve the problem. While technology solutions such as data lakes and data-governance platforms can help, they aren't a panacea.

Without quality-assuring governance, companies not only miss out on data-driven opportunities; they waste resources. Data processing and cleanup can consume more than half of an analytics team's time, including that of highly paid data scientists, which limits scalability and frustrates employees. Indeed, the productivity of employees across the organization can suffer: respondents to our 2019 Global Data Transformation Survey reported that an average of 30 percent of their total enterprise time was spent on non-value-added tasks because of poor data quality and availability (Exhibit 1).

While it's challenging to directly attribute value to data governance, there are multiple examples of its significant indirect value. Leading firms have eliminated millions of