
Normware: A New Paradigm for Designing Artificial Devices within Human Institutions


Key concepts
The integration of computational systems within human institutions requires a new approach beyond traditional code-driven or data-driven law, necessitating the development of "normware" to manage the complexities of regulation, qualification, and expectations in socio-technical systems.
Abstract

This article presents a conceptual framework for "normware," a proposed third level of computation beyond hardware and software, aimed at addressing the limitations of code-driven and data-driven law in regulating artificial devices within human institutions.

The author argues that both code-driven and data-driven law, while seemingly offering legal certainty, are susceptible to challenges. Text-driven law, despite its reliance on open-textured concepts and multi-interpretability, allows for contestability and adaptation, a feature missing in purely computational approaches.

The article proposes "normware" as a solution, encompassing computational artifacts that regulate behaviors, qualifications, and expectations. Normware artifacts can range from access control policies to machine learning models, all interpreted as coordinating mechanisms within a larger socio-technical system.
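The idea of a normware artifact that qualifies behavior while remaining open to escalation can be illustrated with a minimal sketch. All names here (`Rule`, `AccessPolicy`, `evaluate`) are hypothetical, not taken from the paper; the point is only that an undetermined outcome signals a handoff to human actors rather than a hard-coded default.

```python
# Minimal sketch of a normware-style artifact: a declarative access-control
# policy read as a coordination mechanism rather than a means of hard control.
# Class and method names are illustrative assumptions, not from the paper.

from dataclasses import dataclass

@dataclass
class Rule:
    role: str        # who the rule applies to
    action: str      # which behavior it qualifies
    effect: str      # "permit" or "forbid"

class AccessPolicy:
    def __init__(self, rules):
        self.rules = rules

    def evaluate(self, role, action):
        # Return the effect of the first matching rule; "undetermined"
        # signals that the decision must be escalated to a human actor.
        for rule in self.rules:
            if rule.role == role and rule.action == action:
                return rule.effect
        return "undetermined"

policy = AccessPolicy([
    Rule("clerk", "read_record", "permit"),
    Rule("clerk", "delete_record", "forbid"),
])

print(policy.evaluate("clerk", "read_record"))    # permit
print(policy.evaluate("clerk", "export_record"))  # undetermined -> human review
```

The deliberate `"undetermined"` outcome is what distinguishes this from a conventional access-control check: the artifact guides the system but leaves an intervention point open.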

The author emphasizes the importance of viewing normware as a process, acknowledging the dynamic and often conflicting nature of directives within complex systems. Drawing parallels with second-order cybernetics, the article highlights the need for feedback loops and conflict resolution mechanisms to manage competing directives and ensure system viability.

The article concludes by outlining research directions for normware, including the development of languages for specifying normware artifacts and mechanisms for resolving conflicts between them. The author stresses the importance of a collaborative approach, involving stakeholders from various disciplines, to ensure the responsible and effective integration of artificial devices within human institutions.


Quotes
"What code-driven law does is to fold enactment, interpretation and application into one stroke, collapsing the distance between legislator, executive and court."

"In short, the core challenge of regulatory technologies is not captured by the requirements expressed in software engineering terms, but by the construction of proper, functioning feedback loops, in the form of intervention points for social actors provided with relevant competence."

"Accepting these two principles leads us to technological innovations which have been neglected so far, addressing primarily computational instruments that operationalize appeal and quashing/overruling within the computational realm."

"Differently from hardware and software, which are primarily defined in terms of control, we expect a piece of normware not to be used for control, but primarily for guidance of the computational system."

Key insights from

by Giovanni Sil... at arxiv.org 10-24-2024

https://arxiv.org/pdf/2410.17257.pdf
Code-Driven Law NO, Normware SI!

Further questions

How can the development and implementation of normware be governed to ensure ethical considerations and prevent unintended consequences?

Governing the development and implementation of normware requires a multi-faceted approach that addresses ethical considerations and mitigates unintended consequences. Key aspects include:

1. Design Principles and Values
- Transparency and Explainability: Normware systems should be designed for transparency, enabling users to understand how decisions are made and which norms are being applied. Explainable AI (XAI) techniques can be crucial in achieving this.
- Fairness and Non-Discrimination: Bias in training data or design choices can lead to discriminatory outcomes. Rigorous testing, diverse datasets, and fairness-aware algorithms are essential for equitable normware.
- Human Oversight and Control: While normware aims to automate certain aspects of regulation, human oversight mechanisms remain crucial for accountability and for intervention in case of errors, biases, or unforeseen situations.
- Contestability and Appeal: Individuals should have clear avenues to contest normware-driven decisions and appeal outcomes, which requires accessible mechanisms for review and redress.

2. Development Process
- Interdisciplinary Collaboration: Developing ethical normware demands collaboration between computer scientists, legal experts, ethicists, social scientists, and stakeholders from affected communities.
- Impact Assessments: Thorough impact assessments throughout the development lifecycle help anticipate potential consequences, identify biases, and address ethical concerns proactively.
- Sandboxing and Testing: Before widespread deployment, normware systems should undergo rigorous testing in controlled environments (sandboxes) to identify and rectify issues.

3. Regulatory Frameworks
- Adaptive Governance: Given the evolving nature of the technology, a flexible, adaptive governance framework is needed, whether by establishing new regulatory bodies or adapting existing ones to oversee normware development and deployment.
- Liability and Accountability: Clear legal frameworks must determine liability and ensure accountability for harm caused by normware systems, including defining the responsibilities of developers, deployers, and users.
- International Standards and Cooperation: International cooperation is vital to establish common ethical principles, standards, and best practices for normware development and use.

4. Public Engagement and Education
- Public Discourse: Fostering open public discourse about the ethical implications of normware helps ensure societal alignment and address concerns.
- Education and Literacy: Promoting digital literacy and educating the public about normware systems empowers individuals to engage critically with this technology.

By embedding these principles into the governance of normware, we can strive to harness its potential while mitigating risks and ensuring its responsible and ethical development and implementation.

Could the reliance on normware lead to a decrease in human oversight and accountability within critical institutional processes?

The reliance on normware carries the potential risk of diminishing human oversight and accountability within critical institutional processes, but this outcome is not inevitable.

Risks of decreased oversight and accountability:
- Automation Bias: Overreliance on normware's outputs, even when flawed or incomplete, can lead to automation bias, where human judgment is sidelined.
- Erosion of Expertise: If human involvement in decision-making processes decreases, expertise and critical thinking skills within institutions may erode.
- Opacity and Complexity: Complex normware systems can become opaque, making it difficult to trace decisions and assign responsibility.
- Shifting Blame: The presence of normware may create a tendency to shift blame for unfavorable outcomes onto the technology itself, obscuring human accountability.

Mitigating the risks:
- Meaningful Human Oversight: Design normware systems with "human-in-the-loop" approaches, ensuring that critical decisions require human review and approval.
- Auditing and Transparency: Implement robust auditing mechanisms to track normware decisions and ensure transparency, allowing for retrospective analysis and accountability.
- Accountability Frameworks: Establish clear lines of accountability for normware-driven decisions, specifying the roles and responsibilities of developers, deployers, and human overseers.
- Cultivating Expertise: Invest in training and education so that individuals working with normware can understand, interpret, and critically evaluate its outputs.

The key is to view normware as a tool to enhance, not replace, human judgment and accountability. By carefully designing normware systems with appropriate safeguards and maintaining a focus on human oversight, the risks of decreased accountability can be mitigated.
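The "human-in-the-loop" safeguard discussed above can be sketched minimally. The function names and the impact-based gating rule are illustrative assumptions: an automated proposal takes effect only for low-impact cases, while high-impact cases must pass through a human reviewer.

```python
# Illustrative sketch (assumed design, not from the paper): normware proposes
# a decision, but high-impact outcomes require explicit human review before
# taking effect.

def automated_decision(case):
    # Placeholder for a normware artifact's output.
    return {"outcome": "deny", "impact": case.get("impact", "low")}

def decide(case, human_review):
    proposal = automated_decision(case)
    if proposal["impact"] == "high":
        # Human-in-the-loop: the overseer can confirm, override, or escalate.
        return human_review(proposal)
    return proposal["outcome"]

# Example overseer who escalates high-impact denials pending appeal.
def cautious_reviewer(proposal):
    return "escalate" if proposal["outcome"] == "deny" else proposal["outcome"]

print(decide({"impact": "low"}, cautious_reviewer))   # deny (automated)
print(decide({"impact": "high"}, cautious_reviewer))  # escalate (human)
```

Passing the reviewer in as a function keeps the human intervention point explicit in the design rather than buried inside the automated path.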

What are the implications of viewing human institutional systems as information processing systems, and how does this perspective challenge traditional notions of agency and responsibility?

Viewing human institutional systems through the lens of information processing systems offers a valuable analytical framework but also challenges traditional understandings of agency and responsibility.

Implications:
- Systemic Perspective: This view highlights the interconnectedness and flow of information within institutions, emphasizing how rules, procedures, and communication shape outcomes.
- Process Orientation: It shifts focus from individual actors to the processes themselves, examining how information is collected, processed, and used to make decisions.
- Identifying Bottlenecks and Inefficiencies: Analyzing institutions as information systems can reveal bottlenecks, redundancies, and areas for improvement in efficiency and effectiveness.
- Designing Interventions: This perspective can guide the design of interventions, such as changes to information flows or decision-making structures, to achieve desired outcomes.

Challenges to traditional notions of agency and responsibility:
- Diffuse Agency: If institutions are primarily information processing systems, agency becomes more diffuse; it is harder to pinpoint individual responsibility when decisions emerge from complex interactions within the system.
- The Role of Structure: Emphasis on information flows highlights how institutional structures and rules, often designed with specific goals, can shape behavior and outcomes, potentially limiting individual agency.
- Accountability in Complex Systems: Assigning responsibility for unintended consequences or systemic biases becomes more challenging when actions are distributed across numerous actors and processes.

Reconciling traditional notions with the information processing view:
- Shared Responsibility: While individual agency may be more diffuse, it does not disappear. Individuals within institutions still make choices and take actions within the constraints of the system, suggesting a model of shared responsibility.
- Transparency and Accountability Mechanisms: To address diffuse agency, institutions need robust transparency mechanisms to track information flows and decision-making processes, enabling better auditing and accountability.
- Ethical Design of Systems: Recognizing institutions as information processing systems underscores the importance of ethical design; the values embedded in institutional structures, rules, and information flows will significantly influence outcomes.

In conclusion, while viewing human institutions as information processing systems offers valuable insights, it requires a nuanced understanding of agency and responsibility. By acknowledging the interplay of individual actions, institutional structures, and information flows, we can develop more effective and accountable systems.