ISO/IEC TR 20547-1:2020 is the first part of the ISO/IEC 20547 series, published in March 2020. As a technical report (TR) rather than an international standard (IS), it provides a conceptual framework and guiding principles for big data reference architectures rather than mandatory specifications.
This technical report responds to industry demand and aims to address the following challenges:
• Lack of unified terminology and conceptual models in the field of big data
• Interoperability problems caused by architectural differences between big data systems
• Lack of a standardized reference framework for implementing big data projects
• Misunderstandings in cross-organizational collaboration
Standard content
Framework and Application Process: The report describes the framework for a big data reference architecture and the process of applying it to a specific problem domain. The steps include identifying stakeholders and their concerns, mapping stakeholders and concerns to roles and sub-roles, developing detailed activity descriptions, and defining functional components.
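The four steps above can be sketched as a toy data model. Everything here is an illustrative assumption for readability (the class names, the keyword-based matching, and the example stakeholders and roles); the TR itself does not prescribe any code or mapping algorithm:

```python
from dataclasses import dataclass, field

@dataclass
class Concern:
    """Step 1: a stakeholder and one of their concerns."""
    name: str
    stakeholder: str

@dataclass
class Role:
    """Steps 2-4: a role with sub-roles and the activities it performs.

    Functional components would later be derived from these activities.
    """
    name: str
    sub_roles: list = field(default_factory=list)
    activities: list = field(default_factory=list)

def map_concerns_to_roles(concerns, roles):
    """Step 2: map each concern to the role whose activities address it.

    The name-matching used here is purely illustrative; the TR leaves
    the mapping method to the architect.
    """
    mapping = {}
    for concern in concerns:
        owner = next(
            (r for r in roles
             if concern.name.lower() in (a.lower() for a in r.activities)),
            None,
        )
        mapping[concern.name] = owner.name if owner else "unassigned"
    return mapping

# Hypothetical example inputs
concerns = [Concern("data privacy", "regulator"),
            Concern("query latency", "data scientist")]
roles = [Role("big data service provider",
              sub_roles=["platform provider"],
              activities=["data privacy", "query latency"])]

print(map_concerns_to_roles(concerns, roles))
```

The point of the sketch is the shape of the process, not the matching logic: stakeholders and concerns are captured first, then traced to roles and sub-roles whose activities are expected to address them.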
Stakeholders and Concerns: The report identifies the stakeholders of a big data architecture and their concerns, and frames those concerns through views such as the user view and the functional view.
Reference Architecture Elements: The report describes the elements of the reference architecture in detail, including stakeholders, concerns, and views.
Application scenarios
ISO/IEC TR 20547-1:2020 is applicable to organizations that need to design, develop, or implement big data solutions. It helps organizations achieve consistency and standardization in big data projects, improving their manageability and scalability.
In addition, this standard provides a common reference framework for suppliers and users of big data technology, which helps promote technical exchange and cooperation.
• Building a common language: unified terminology and conceptual models make communication and collaboration across teams, organizations, and vendors far more effective.
• Guiding architecture design: a systematic methodology and best-practice references for designing complex big data solutions help avoid reinventing the wheel and common architectural pitfalls.
• Promoting interoperability and integration: clear component definitions and interface specifications reduce the difficulty and cost of integrating different technology stacks, protect investments, and help avoid vendor lock-in.
• Supporting technology selection and evaluation: an objective framework for evaluating and comparing big data products, platforms, and services.
• Addressing non-functional requirements: emphasis on key quality attributes (security, scalability, fault tolerance, etc.) guides the construction of robust, reliable, and secure big data systems.
• Advancing standardization: as the foundation of the series, it supports broader and deeper standardization work in the big data field.
• Reducing risk and cost: following a validated architecture framework lowers the risk of project failure, improves development efficiency, and optimizes resource use.
• Enterprise architects and solution architects: designing and planning big data platforms.
• IT managers and decision makers: evaluating technology selection and investment strategies.
• Data engineers and data scientists: understanding the system context and making better use of platform capabilities.
• System integrators and software vendors: developing standards-conformant, easy-to-integrate products and solutions.
• Standards bodies and researchers: a foundation for further research and standardization.