Ethical Resolve Blog

Getting the formula right: Social trust, A/B testing and research ethics

Image courtesy of Flickr user x6e38 under CC license

Most Internet services, and especially social media services, routinely conduct experiments on users’ experiences, even though few of us are aware of it and consent procedures are murky. In a recent New York Times op-ed, Michelle Meyer and Christopher Chabris argue that we should enthusiastically embrace the model of experimentation on users called “A/B testing.” This type of data-intensive experimentation is the bread and butter of the Internet economy and is now at the heart of a sprawling ethical dispute over whether experimenting on Internet users’ data is equivalent to human experimentation on legal, ethical or regulatory grounds. In their op-ed, Meyer and Chabris argue that A/B testing is on the whole ethical because without it Internet services would have no idea what works, let alone what works best. They suggest that whatever outrage users might feel about such experiments is due to a “moral illusion”: we are prone to assuming that the status quo is natural and that any experimental changes need to be justified and regulated, when in reality Internet services have no non-experimental state.


While they’re right that this type of experimentation is a poor fit for the ways we currently regulate research ethics, they fall short of explaining that data scientists need to earn the social trust that is the foundation of ethical research in any field. Ultimately, the foundations of ethical research are about trusting social relationships, not our assumptions about how experiments are constituted. This is a critical moment for data-driven enterprises to get creative and thoughtful about building such trust.

Even if those specific regulations do not work for A/B testing, fostering and maintaining such trust remains an essential component of knowledge production in the era of big data.


A/B testing is the process of dividing users randomly into two groups and comparing their responses to different user experiences in order to determine which experience is “better.” Whichever website design or feed algorithm best achieves the preferred outcome—such as increased sales, regular feed refreshes, or more accurate media recommendations—becomes the default user experience. A/B testing appears innocuous enough when a company is looking for hard data about which tweaks to a website design drive sales, such as the color of the “buy” button. Few would argue that testing the color of a buy button or the placement of an ad requires the informed consent of every visitor to a website. However, when a company possesses (or is accessing via data mining) a vast store of data on your life, your political preferences, your daily activities, your calendar, your personal networks, and your location, A/B testing takes on a different flavor.
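To make the mechanics concrete, here is a minimal sketch of how the assignment and comparison described above might work in practice. The function names, the hashing scheme, and the choice of a two-proportion z-test are illustrative assumptions, not any particular company's implementation:

```python
import hashlib
import math


def assign_variant(user_id: str) -> str:
    """Deterministically assign a user to group 'A' or 'B'.

    Hashing the user ID (rather than flipping a coin per request)
    keeps each user in the same group across repeat visits.
    """
    digest = hashlib.sha256(user_id.encode("utf-8")).hexdigest()
    return "A" if int(digest, 16) % 2 == 0 else "B"


def two_proportion_z(conversions_a: int, n_a: int,
                     conversions_b: int, n_b: int) -> float:
    """z-statistic for the difference between two conversion rates.

    A positive value means variant B converted better; |z| > 1.96 is
    conventionally read as significant at the 5% level.
    """
    p_a = conversions_a / n_a
    p_b = conversions_b / n_b
    pooled = (conversions_a + conversions_b) / (n_a + n_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se
```

Note that whether such a test is innocuous depends not on this arithmetic but on what the variants actually vary: button colors are one thing; emotionally charged feed content drawn from that vast store of personal data is another.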

How is the Drone Industry Handling Privacy?

When it comes to privacy, the drone industry is not clear for takeoff.

Despite being a relatively small event, the Drones, Data X Conference in Santa Cruz last month painted a large canvas of the current state of affairs in the world of UAV technology. The audience heard from industry leaders in business and government, and was able to interact with a large number of drone enthusiasts and hobbyists. These two groups have completely different views of the world of drone flight.

From the perspective of hobbyists like Ryan Jay, the world of the drone hobbyist is a bit like the wild west: there are FAA regulations “that have no teeth” to control his activities in drone flight. He is a one-man flight team, able to build, launch and pilot his own UAVs using First Person View (FPV) by means of a camera mounted on his vehicle. Ryan has built (and lost) several vehicles over the past few years as a hobbyist, and he doesn’t see his ability to pursue this hobby being meaningfully limited by the long conversations going on in the drone industry between NASA, the FAA and drone hardware and software makers.

The view from above 1,200 feet is different. In an industry with investment already in the billions there is no shortage of careful thinking going into answering questions about how drones should be regulated to protect privacy and security. Whereas the private drone hobbyist can do everything herself, using drones for commercial purposes is highly visible and highly regulated. The billions invested in the drone industry have not been spent to fulfill the desires of hobbyists: there is massive ROI projected for companies who are able to leverage drones for purposes that are currently done inefficiently by other means.

Whether we are talking about aerial inspection of powerlines, flare stacks, wind farms, oil pipelines, and solar arrays; surveying of forests, mines, quarries and agricultural resources; or the use of drones for disaster relief, search and rescue, and delivery of medical supplies to remote areas, drones are cast as the tool that makes the impossible possible. There is a recurring narrative in this drone community about the power of aerial vision: the phrase “god’s eye view” of the world was repeated so many times over the course of the day that it began to make me uncomfortable. Romeo Durscher, Director of Education for DJI (the largest drone maker in the world), used the phrase more than once in his talk, saying at one point, “The God’s eye view is really my favorite view.”

Proud sponsor of the 2015 Bonny Doon Art & Wine Festival

Ethical Resolve is proud to be a member of the Bonny Doon community, and we are very happy that we can support the art, music and science programs at Bonny Doon Elementary as a sponsor of the Bonny Doon Art & Wine Festival. The funds raised through this event provide the bulk of the budget our little school system needs to support dedicated art and science teachers. Ethical Resolve believes in the value of a diverse and well-rounded education for all children. Come meet us at the festival on June 6, 2015 and enjoy some great wine, food and redwoods!

Beyond Privacy

This series of blog posts will address existing and emergent concerns about ethics in the education technology market. The goal of this series is to help technology entrepreneurs begin to think through and respond to the risks their companies face with regard to ethical challenges. Failure to develop an effective and clear ethics policy has destroyed multi-million dollar companies, and will do so again in the future. Too often, entrepreneurs think that they can wait to respond to ethical concerns until after their product has become successful. We have observed over and over again that the most successful products are designed with the ethical use of that product in mind. These posts will encourage readers to consider how successes and failures in the technology market can be understood in terms of ethical approach.


Data Ethics in Education Technology

How data is used in education technology can determine the success or failure of a company. By looking at the history of the ed tech market we can learn how good ethics forms the foundation of successful ed tech companies.


The debate about the role of big data in education has sometimes been cast as a dispute between reformers and traditionalists. Reformers are described as those who support technology as the solution to the problems of the education system, whereas traditionalists tend to support teacher pay increases, reduced class sizes, and similar measures as the better approach. This debate has often conflated multiple ethical questions within education technology under a single heading: concerns about big data. Disentangling the concerns surrounding the processing of education data is mandatory if companies are to effectively formulate and communicate sound ethical data use policies to their current and potential users. When it comes to ed tech, good intentions and high hopes for digital tools are not enough—parents, teachers and students need concrete evidence that providers are not merely interested in the bottom line.


Fully understanding what’s at stake requires that we distinguish between issues of data privacy, data commercialization and predictive data modeling. By separating and clarifying threads of the discussion we can better understand the ethical questions in the discussion, avoiding the politicized binary in the debate. I will address each of these issues in separate posts, beginning here with a discussion of some notable successes and failures around data privacy. This will allow us to respond to the technological needs of stakeholders with greater care and efficiency.


The ethical concerns around big data privacy became increasingly pronounced in 2014, and this discussion has become very visible in education technology. The very public collapse of InBloom, which failed to properly communicate its policies on data handling, helped bring these questions to the forefront. As researchers Jules Polonetsky and Omer Tene have described it, InBloom’s rapid expansion “brought to the fore weighty policy choices, which required sophisticated technology leadership and policy articulation.” That leadership and articulation were never achieved, and ultimately the public outcry against InBloom forced many districts to end their relationships with the company. The lesson to be learned here is not that big data analytics are a failed solution to the problems of our education system, but that big data practices need to be more carefully articulated. Education data collection needs to be focused on generating demonstrable learning outcomes, rather than on collection for its own sake. One notable feature of InBloom’s failed policy was the move to collect as many data points about students as possible, rather than targeting data collection practices around specific hypotheses. The “collect and then measure” approach to education data is a mistake. Ed tech companies need to be able to say why they are collecting each data set and exactly what that data is going to achieve. To succeed, data practices must also include robust and simple privacy policies and be explained in ways the public can easily understand.


Creating Ethical Resolve

Having ethical resolve is in some ways as straightforward as having a commitment to acting with integrity and responsibility. In practice, however, true ethical resolve means treating ethical decisions as an ongoing, informed and engaged process. A commitment to ethical responsibility and integrity is not a matter of arriving at a single ‘stance’, as mission statements would have us believe, or at a single ‘solution’ to a particular issue, but a re-solving process that must be part of how a responsible and effective company functions.