According to the EU’s General Data Protection Regulation, transparency should be considered by design in data processing activities. Transparency entails the promise of control and legitimacy: if we can see inside algorithmic systems, we can ensure compliance with legal rights and principles. But can “by design” ensure compliance? I interrogate the relationship between law and technology by asking how law can capture the products and intricacies of design processes. Combining socio-legal studies and science and technology studies, I argue that “transparency by design” does not exert meaningful control. Design should be understood not only as the production of algorithms but also as human-driven, contextual social processes in which values are prioritized and negotiated, ignored and assumed, and at times fought over and compromised. These design processes often lack transparency and democratic participation, which creates legitimacy gaps; yet transparency of design, as distinct from transparency by design, is not at the core of data protection. Despite the limitations of transparency, transparent design would make these social practices explicit and reintroduce participation. It would also repoliticize technological design by creating space for prioritizing and operationalizing values. The shift to design facilitates a discursive turn to the procedural language of access to justice. If we prioritize access alongside transparency as a guiding design constraint, the humans involved in design processes and interacting with algorithmic systems become visible, giving us new tools, such as measurable accessibility and usability, for legally informed technological design.
Koulu, R. (2021). Crafting Digital Transparency: Implementing Legal Values into Algorithmic Design. Critical Analysis of Law, 8(1), 81–100. https://doi.org/10.33137/cal.v8i1.36281