Like the braids of a rope, scientific and technical knowledge, policy, and law intertwine to produce the rules and permissions that insert technology into daily life
- These processes lie on a continuum between the democratic - where scientific knowledge is understood to arise through a social process whose values underpin how decisions are made - and the technocratic, a perspective favoured by commercial and industrial interests, where the 'solution is to get more and better science into the decisions.'
Risk Beyond Privacy
- New technological frontiers weld biometric and digital identity data into the mainframes of governments and large private institutions
- Partnerships with the private sector are commonplace: industry consultants provide expertise, and apps and plugins enhance framework operability while creating new opportunities to manage information
- The closed-door public-private arrangements carry with them the potential for systemic and sustained abuse of power
New Zealand
- In New Zealand, new legislation, the Digital Identity Services Trust Framework Bill, is underway
- The Bill provides for the establishment of a 'trust framework' authority and board, which will be responsible for guidance and oversight of the framework
- For service providers, it is an opt-in, fee-paying framework
- Unfortunately, regulatory environments are a product of institutional culture and resourcing
- When regulation is funded by the fees of those it oversees, the regulator must ultimately think like the institutions it is paid to regulate.
Power and Social Control
- Environments shape knowledge systems, whether at the individual level, for the official in government, or at the population level.
- Surveillance is one form of knowledge aggregation, accepted by the public on the grounds that it promotes national security
- When civil society understands or suspects surveillance, it is more likely to modify its behavior
Innovation Has Displaced Knowledge
- Techno-scientific culture is the inevitable consequence of four decades of innovation-centric policies that valorize research and science for economic gain
- Science and technology for innovation has displaced public good basic science
- Innovation produces new knowledge and valuable patents
- For techno-scientific, growth-centric policymakers, benefits to society, the economy, and the commercial developer are presumed to dovetail
The Point Where Stewardship Falters
- There are two major steps to getting technologies on the market and keeping them there
- First, the introduction and authorisation of technologies when they are new, when we don't know much about them
- Then, there is the process of understanding what happens as the social and scientific literature builds a picture of risk or harm; and adjusting policies to ensure that human and environmental health are protected
Blackboxing Knowledge and Risk
- The shift to deferring to industry science favours under-regulation of technologies in at least five ways:
- Through the development of complex laws and technical guidelines that narrowly codify regulatory logics away from broader understandings of risk
- Through stakeholder networks in which dominant industries with conflicts of interest (COIs) secure privileged access to the development of policy
- Through the primacy of commercial-in-confidence and data protection agreements that set aside democratic norms of transparency
- Through the absence of non-industry-funded research and science that might identify and understand complex risk scenarios otherwise downplayed by industry science and regulatory frameworks
- And, relatedly, through the failure to include scientific experts who can feed back into regulatory and policy arenas, triangulating and contesting industry claims
Governance policy processes drenched in conflicts of interest
- Institutional networks and early access to policy development create profound power asymmetries, keeping publics, including indigenous, civil, and human rights groups, at arm's length
- The COIs are buried in secret data, governance arrangements and system architecture
- Power exerts itself in many ways: it can be instrumental (such as lobbying power), structural (based on size and insight from business activities), or discursive - the power to promote ideas and shape social, economic, and cultural perspectives
Commercial in Confidence Agreements
- Contrary to democratic norms of transparency, industry data required by regulators for decision-making is ordinarily kept secret under commercial-in-confidence agreements (CICAs).
- At risk of being heretical, are CICAs modernity's Ark of the Covenant? Housing valuable secrets most can't see, to which only a privileged few ever have access?
- Does the sheer quantity of these agreements now held by governments inevitably corrupt their original purposes? Could governments be weaponizing them instead?
The absence of non-industry science
- Governments don't meaningfully fund our public science institutions or our regulators
- CICAs often prevent independent scientists from accessing the compounds and technologies they would need in order to research them
- Independently produced science and research can identify unknown, off-target and unanticipated risks that may be outside policy or regulatory consideration
- Public good science can explore chemistry, biology, and integrate new techniques
- This is the type of research that can analyze new knowledge about technologies as the literature builds a picture of risk or harm
Regulators in name only
- Regulators are simply never granted investigative, inquisitorial powers
- Technology and chemical regulators ordinarily lack meaningful budgets to detect anomalies, disruptions and threats before harm occurs
- What could we require of regulators?
- That they conduct methodological reviews of the published science
- Report on legal decisions from offshore jurisdictions
- Demand that public scientists fill the gaps left by industry science and data provision
Digital Expansionism
- These shifts have encouraged policy, legal, and regulatory cultures that marginalize and set aside a language of risk that should encompass uncertainty and complexity
- They set aside, and outrightly dismiss, democratic norms, such as transparency and accountability
- Manmade emissions and exposures are all encompassing, permeating daily life and resulting in the subjection of the individual to potentially harmful technologies from conception
Repurposing Potential Built in to System Architecture
- This burgeoning, out-of-control risk now appears to underpin digital identity systems, where 'trust' and 'responsibility' are designed by the institutions with the COIs
- Risk pivots from emissions or exposures, to risk from surveillance and policy instruments
- These instruments contain exceptional potential to nudge, coerce, and force compliance in daily life, distorting personal autonomy and sovereignty
Cultural Capture
- Opaque digital identity systems and the co-existing governmental and private sector frameworks can be repurposed to shape behavior
- The default position of relying on industry science to underpin policy is a function of the decline of public good science and the rise of industry power
- Industry knowledge and expertise, and industry culture pervades the drafting of related laws and guidelines
- When cultures are captured, the industry data is imagined as 'apolitical' while the publicly produced data is viewed as political and controversial
https://brownstone.org/articles/a-revolution-under-the-cloak-of-normalcy/