Article

Combining BioTRIZ and Multi-Factor Coupling for Bionic Mechatronic System Design

by Bingxin Wang * and Dehong Yu
School of Mechanical Engineering, Xi’an Jiaotong University, No.28 Xianning West Road, Xi’an 710049, China
*
Author to whom correspondence should be addressed.
Appl. Sci. 2024, 14(14), 6021; https://doi.org/10.3390/app14146021
Submission received: 3 June 2024 / Revised: 6 July 2024 / Accepted: 9 July 2024 / Published: 10 July 2024
(This article belongs to the Special Issue Mechatronics System Design in Medical Engineering)

Abstract

To realize the design process of bionic mechatronic systems, involving mapping from engineering to biology and inversion from biology to engineering, a novel design paradigm is introduced that integrates BioTRIZ with multi-factor coupling bionics. In the mapping stage from engineering to biology, BioTRIZ is employed to frame the concrete engineering issue as a general conflicting problem. The biological solution is refined by amalgamating the BioTRIZ solution derived from the contradiction matrix with biological instances. In the inversion stage from biology to engineering, a novel approach is proposed for constructing a bionic multi-factor coupling model, drawing inspiration from the establishment of the biological multi-factor coupling model. This allows for a seamless correspondence between biological elements, such as morphology and behavior, and their respective engineering counterparts, including structure and algorithms, and ultimately yields an engineering conceptual model rooted in biological principles. The practical application of this methodology is exemplified through a multi-biometric fusion bionic active vision system, underscoring its feasibility and efficacy.

1. Introduction

To adapt to changes in climate, habitat, and food sources, creatures have undergone diverse morphological, structural, and functional evolutions through “natural selection”, enabling them to thrive in their respective environments. These adaptations have garnered significant interest not only from biologists but also from engineers seeking to harness this biodiversity through bionic principles [1,2,3,4,5,6].
Problem-driven biomimetic approaches aim to discover solutions in the biological realm for specific engineering issues. Nevertheless, evolution often reflects compromises among multiple survival necessities. For instance, the evolution of the finch beak balances various functions, such as singing, attracting mates, and grooming, rendering its structure not necessarily optimal solely for grinding food [7]. When applied to enhance the pulverizing efficiency of a jackhammer, the outcome is therefore not an optimal but a feasible solution. On the other hand, solution-driven biomimetic approaches necessitate extracting relevant knowledge from extensive studies of living organisms to create or enhance artificial products [8,9]. However, in the context of mechatronic systems, which integrate structures, drivers, control algorithms, and more, abstracting biological knowledge into a single mathematical, physical, or structural model not only falls short of fully describing the system's overall functionality but also struggles to represent the relationships between its engineering modules.
The design of a bionic mechatronic system necessitates the realization of a comprehensive process that involves an engineering-to-biology mapping and a biology-to-engineering inversion. To this end, we introduce a bio-inspired design paradigm that integrates BioTRIZ [10] with the Extensive Model of Multi-factor Coupling Bionics (EM-MCB) [11,12]. This paradigm is based on relationship mapping inversion. We employ a bionic active vision system as a representative example to illustrate the design framework of the bionic mechatronic system, which is depicted in Figure 1.
In the mapping stage from engineering to biology, the engineering issue is first abstracted into a BioTRIZ conflicting problem. The BioTRIZ solution is then obtained through the contradiction matrix and combined with multiple prototypes from a biological database to form a complete biological solution. This combination enables the analysis of common features and evolutionary trends across various creatures, supports a systematic analysis of the engineering requirements, and widens the search range of biological instances.
In the inversion stage from biology to engineering, a biological coupling model and an extensive model are established based on the principle of multi-factor coupling bionics, so that the morphological and behavioral elements of the biological system can be matched with the structure, algorithms, and other modules of the mechatronic system. By establishing the bionic coupling model and extensive model analogously, a correspondence with the biological models is formed at the coupling-element level, which in turn converts the biological solution into an engineering conceptual model.

2. Related Works

Both problem-driven and solution-driven approaches in bio-inspired design necessitate bridging the gaps between disciplines through knowledge cross-domain mapping. Vincent et al. [10] proposed BioTRIZ, which integrates classical TRIZ theory with bionic principles, to establish a “bridge” between engineering and biology. Snell-Rood et al. [13] expanded the search scope for biologically inspired solutions by transforming engineering issues into biological ones based on the concept of “functionality”. Bian et al. [14] leveraged the pre-trained BERT model [15] to assess the semantic similarity between engineering and biological terms, facilitating bionic inference. Deng et al. [16] proposed a human–machine collaborative deep generative model for bionic design, visually depicting potential mappings between biomorphic forms and product shapes. Kruiper et al. [17] developed the Focused Open Biology Information Extraction (FOBIE) dataset, which is aimed at discovering relevant cross-domain scientific literature for bionics research. Numerous other scholars [18,19,20,21,22,23] have also contributed various computer-aided tools that rely on functional similarity to perform the crucial cross-domain mapping of knowledge in bionic design.
The fundamental objective of bionics is to harness insights from biological instances to tackle engineering issues, and a thorough understanding of biological prototypes serves as the foundation for achieving this. In pursuit of this goal, Ren et al. [11,12] introduced the Extensive Model of Multi-factor Coupling Bionics (EM-MCB). This model offers an efficient tool for engineering biomimetic design by delving into the mechanisms of biological prototypes and elucidating the principles of biological coupling. Nagel et al. [24] developed AskNature, an online database that employs functional representation and abstraction techniques to identify and access biological prototypes. Abdala et al. [25] created a knowledge base focused on biological effects grounded in TRIZ principles. Liu et al. [26] evaluated and selected biological prototypes based on topological, role, strategic, and structural similarity. Mak et al. [27] proposed a hierarchical approach that organizes biological phenomena into form, behavior, and principle. This structure presents a latent analogy that can inspire design solutions. Cao et al. [28] suggested abstracting biological prototypes based on their function, behavior, and structure. They further evaluated the fitness of these prototypes for engineering applications using a fuzzy triangular numbers-based algorithm. Hou et al. [29] designed a knowledge base focused on multi-biological effects and integrated TRIZ to establish a design process model by combining product functions. Bai et al. [30] combined BioTRIZ with biological coupling analysis, orthogonal analysis, and scheme merit value calculation to construct a multi-biological prototype bionic model.
In brief, bionics research has frequently focused on particular links within the biomimetic chain. Scholars have endeavored to refine tools and bridge disciplinary divides rather than address the entirety of this complex process, and much of this work has revolved around single-factor bionics. With the focus gradually shifting towards multi-factor bionics, this paper combines methodology and practical application to explain the bio-inspired design paradigm, using a typical mechatronic system to elucidate the multi-factor bio-inspired design framework more clearly.

3. Engineering-to-Biology Mapping

3.1. Problem Description

Vision, as a crucial avenue for humans and numerous vertebrates to gather environmental information, has garnered extensive research in biology [31,32,33,34,35]. In an effort to endow robots with comparable perceptual abilities, bionic active vision systems have become a significant focus of interest in recent years [36,37].
“Eyes in the front, the animal hunts. Eyes on the side, the animal hides” encapsulates the inherent trade-off between a wide vision field and precise visual localization in nature. Predators, with their powerful stereoscopic vision, can pinpoint the exact location of their prey, but may sacrifice a comprehensive awareness of their surroundings. Conversely, prey animals possess a broad vision field that allows them to detect potential predators in their environment, but this comes at the cost of binocular stereo vision, making precise localization nearly impossible.
The majority of existing bionic active vision systems adopt a binocular structure, and are utilized for environmental monitoring, situational awareness, object detection, and tracking, among other applications [38,39,40,41]. These systems typically emulate the visual system of a specific vertebrate [42,43,44], either by modeling binocular stereo vision for 3D measurements or by replicating a wide vision field for environmental monitoring and scene comprehension. Additionally, there are visual systems that enable switching between these two functions through mimicking eye movements, offering a compromise between the two, but they cannot simultaneously acquire both global vision and precise information [45].
The application environment of machine vision is increasingly unstructured and complex. Our objective is to devise a bionic vision system that can simultaneously maintain a global vision field and acquire accurate local information. To achieve this, we incorporate a bio-inspired design paradigm that implements engineering-to-biology mapping and biology-to-engineering inversion during the design process.

3.2. BioTRIZ

Engineering-to-biology mapping is essentially a cross-domain knowledge translation. Through the analysis of numerous patents, TRIZ [46] has revealed that every creative patent essentially solves a conflicting problem and that the basic principles for resolving these contradictions are highly reusable.
Possessing a wide vision field and acquiring precise information often present a conflicting pair: an improvement in one aspect typically leads to a deterioration in the other. In this context, the application of TRIZ becomes a viable solution. Notably, TRIZ is a systematic theory that emerges from the refinement and reorganization of existing knowledge across diverse fields [47]. Its strength lies in the ability to address problems in various domains using common principles, facilitating the knowledge transfer from one field to another. This aligns well with the fundamental requirements of bio-inspired design, which aims to translate functions, structures, and principles across different domains [48].
TRIZ has distilled 39 widely applicable Engineering Parameters (EPs) from numerous patents and introduced 40 universally relevant Inventive Principles (IPs). These IPs are grounded in the common “technical contradiction”, in which one parameter improves while another deteriorates. However, TRIZ originated in the domain of artificial, non-living, technical, and engineered systems. To integrate TRIZ with the biology domain and cater to the demands of bionics, Vincent et al. proposed BioTRIZ [10]. BioTRIZ replaces the 39 EPs with 6 Operational Fields (OFs): substance, structure, space, time, energy, and information. Furthermore, it establishes the BioTRIZ contradiction matrix, as presented in Table 1. This shift to OFs not only condenses the original EPs and IPs of TRIZ, but also simplifies and streamlines the invention process, making it more logical and accessible.
We apply BioTRIZ in the mapping stage from engineering to biology, as illustrated in Figure 2. Initially, the engineering issue is formulated as a BioTRIZ conflicting pair. Subsequently, a BioTRIZ solution is retrieved from the contradiction matrix. Ultimately, this solution is integrated with biological instances to derive a complete biological solution.
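To make the lookup step concrete, the following minimal Python sketch (our own illustration, not an artifact of BioTRIZ or of this study) encodes the six OFs and a single cell of the contradiction matrix as a dictionary; the function and variable names are assumptions chosen for readability.

```python
# Minimal illustrative sketch: the engineering-to-biology mapping of Figure 2
# reduced to a dictionary lookup. Only the matrix cell used later in this
# paper (improve Space without degrading Information) is filled in; the
# remaining cells are listed in Table 1.

OPERATIONAL_FIELDS = ("substance", "structure", "space", "time", "energy", "information")

# (improved OF, deteriorated OF) -> Inventive Principles suggested by Table 1
CONTRADICTION_MATRIX = {
    ("space", "information"): [3, 15, 21, 24],  # IP3 Local Quality, IP15 Dynamics,
                                                # IP21 Skipping, IP24 Intermediary
    # ... the other cells of Table 1 would be entered here ...
}


def biotriz_solution(improved: str, deteriorated: str) -> list:
    """Return the IPs recommended for a given BioTRIZ contradiction."""
    if improved not in OPERATIONAL_FIELDS or deteriorated not in OPERATIONAL_FIELDS:
        raise ValueError("both arguments must be one of the six Operational Fields")
    return CONTRADICTION_MATRIX.get((improved, deteriorated), [])


# The conflict of Section 3.3: widen the vision field (space field) without
# losing precise target information (information field).
print(biotriz_solution("space", "information"))  # -> [3, 15, 21, 24]
```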

3.3. BioTRIZ Solution

To resolve the problem presented in Section 3.1, the engineering contradiction between achieving a wide vision field and precise information acquisition is initially converted into a BioTRIZ contradiction.
The parameters related to the vision field in a bionic binocular vision system typically include the viewing angle of the image sensors and the direction of the optical axis relative to the system midline, as illustrated in Figure 3. The image captured by the cameras encompasses a three-dimensional space. If a sensor is mounted on a gimbal, it can be regarded as a moving object; otherwise, it is considered a stationary object. Consequently, the issues pertaining to the vision field can be framed as
  • EP7 Volume of moving object; or EP8 Volume of stationary object.
The direction of optical axis determines the appearance of the system, which can be abstracted as
  • EP12 Shape.
Furthermore, a wide vision field results in a lack of precise information, and a narrow vision field leads to insufficient global information, which can be abstracted as
  • EP24 Loss of information.
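This contradiction can also be stated quantitatively. As a simplified planar illustration (our own, not part of the EP definitions), consider two identical cameras with viewing angle $\alpha$ whose optical axes lie in one horizontal plane and are rotated outward from the system midline by an angle $\beta \le \alpha/2$ (the parameters of Figure 3):

$$\text{complete vision field} = \alpha + 2\beta, \qquad \text{binocular overlap} = \alpha - 2\beta .$$

Increasing $\beta$ widens the complete field (the space field improves) while shrinking the stereo overlap on which precise localization depends (the information field deteriorates), which is exactly the contradiction carried forward below.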
According to Appendix 2 of the literature [10], EP7, EP8, and EP12 fall under the space field, whereas EP24 pertains to the information field. Consequently, the BioTRIZ problem we aim to address is to prevent deterioration of the information field while optimizing the space field. This aligns with observations of biological vision systems, where both predator and prey vision systems experience a loss of either global or local information. The intersection of the space field and the information field in Table 1 yields the potential BioTRIZ solutions, comprising the following IPs:
  • IP3 Local Quality.
    - IP3-1: Change an object’s structure, action, environment, or external influence/impact from uniform to non-uniform.
    - IP3-2: Make each part of an object function in conditions most suitable for its operation.
    - IP3-3: Make each part of an object fulfil a different and/or complementary useful function.
  • IP15 Dynamics.
    - IP15-1: Change the object (or outside environment) for optimal performance at every stage of operation; make them adaptable.
    - IP15-2: Divide an object into parts capable of movement relative to each other.
    - IP15-3: Change from immobile to mobile.
    - IP15-4: Increase the degree of free motion.
  • IP21 Skipping.
    - Conduct a process, or certain stages of it (e.g., destructive, harmful, or hazardous operations), at high speed.
  • IP24 Intermediary.
    - IP24-1: Use an intermediary carrier article or intermediary process.
    - IP24-2: Merge one object temporarily with another.

3.4. Biological Prototypes

Combining the IPs with biological instances can result in a complete biological solution, as presented in Table 2. We utilize strategies related to “eyes” and “vision” from the AskNature online database [24] as biological prototypes. The biological instances indicate several key evolutionary directions in solving survival problems similar to our engineering issues:
  • Adding auxiliary eyes to enhance the existing capability of visual systems.
  • Differentiating the functional roles of the left and right eyes, which originally served identical purposes, allowing them to independently fulfill distinct functional requirements.
  • Modifying the internal structure of the eyes to enable different parts to perform specialized functions, thus increasing the versatility and efficiency of the visual system.
  • Enhancing the range and speed of eye movements to achieve functional transitions between a wide vision field and local information acquisition.
A single biological instance often balances multiple survival needs, making it difficult to derive optimal solutions to engineering problems by mimicking a specific species. Instead, we aim to integrate various demand-fulfilling features from the above evolutionary directions to effectively address engineering issues.

4. Biology-to-Engineering Inversion

4.1. Extensive Model of Multi-Factor Coupling Bionics

After obtaining the biological instances and inventive principles, the biological solutions need to be inverted to the engineering domain, as depicted in Figure 4. In traditional single-factor bionics, engineers tend to focus on a single aspect of morphology, structure, or neural mechanism and design the bionic model or algorithm based solely on that factor. Nevertheless, the adaptive functions displayed by organisms are actually the outcome of the interplay between various related factors. Likewise, in a mechatronic system, the mechanical structure, actuators, and software must collaborate seamlessly. Consequently, it is difficult to describe the interactions between these engineering modules using a single mathematical, physical, or structural model.
To provide a comprehensive understanding of the biological coupling mechanism, Ren et al. introduced the Extensive Model of Multi-factor Coupling Bionics (EM-MCB) [11,12]. This model defines the diverse elements that impact biological functions as Biological Coupling Elements (BCEs) and the ways of interaction between these elements as Biological Coupling Ways (BCWs). To mirror similar coupling mechanisms in mechatronic systems, we have analogously established an EM-MCB framework in the engineering realm. Within this framework, the elements that shape system functions are designated as Engineering Coupling Elements (ECEs), while the ways of association between these modules are termed Engineering Coupling Ways (ECWs). By integrating the multi-factor coupling model of the bionic system with the biological solution, we arrive at the engineering conceptual model. Once the structure, specifications, and algorithms of this conceptual model are firmly established, it can be instantiated into engineering forms.

4.2. Biological Model

A biological model of EM-MCB comprises two primary components: a coupling model and an extensive model. The coupling model delves into the biological coupling mechanism, identifying all the coupling elements that impact the biological function. The extensive model, rooted in the theory of extenics primitives and the theory of conjugate analysis [49], treats all these coupling elements as the hard part and the relationships between them as the soft part.
To synthesize the visual advantages observed in various biological prototypes, we have established a biological coupling model that is based on the shared features of vertebrate oculomotor and ocular structures, as illustrated in Figure 5. From this model, it becomes evident that visual perception arises from the integration of multiple components, including eye layout, eyeball structure, and the optic nerve. The extraocular muscles play a crucial role in controlling eye movements such as scanning and fixating. Meanwhile, the optic nerve implements visual attention and transmits sensory stimuli to the cerebral cortex for higher-level visual processing.
Importantly, the number of BCEs included in the model can be expanded based on the specific research objectives. Similarly, the descriptions of BCEs can be progressively refined in a hierarchical manner to meet the evolving needs of the study. This flexibility allows the model to serve as a dynamic tool for exploring and understanding the complexities of vertebrate vision.
After further analyzing BCEs and referring to the values provided in [50,51,52,53,54,55,56,57,58,59,60], the BCEs depicted in Figure 5 can be described as follows:
$$M_1 = \begin{bmatrix} \text{Structure: Binocular} & \text{Direction of Optical Axis} & 0^{\circ}\text{--}90^{\circ} \\ & \text{Complete Vision Field} & 150^{\circ}\text{--}360^{\circ} \\ & \text{Binocular Overlap} & 10^{\circ}\text{--}124^{\circ} \end{bmatrix}$$

$$M_2 = \begin{bmatrix} \text{Structure: Eyeball} & \text{Vision Field} & 105^{\circ}\text{--}190^{\circ} \\ & \text{Fovea Num} & 0\text{--}2 \end{bmatrix}$$

$$M_3 = \begin{bmatrix} \text{Material: Optic Nerve} & \text{Length} & 42\ \text{mm}\text{--}47\ \text{mm} \end{bmatrix}$$

$$M_4 = \begin{bmatrix} \text{Structure: Extraocular Muscle} & \text{Num} & 6 \\ & \text{Rotation} & \text{Horz, Vert} \\ & \text{Type} & \text{Rectus, Obliques} \end{bmatrix}$$

$$M_5 = \begin{bmatrix} \text{Behavior: Eye Movement} & \text{Range} & 1^{\circ}\text{--}180^{\circ} \\ & \text{Speed} & 600^{\circ}/\text{s} \end{bmatrix}$$

The BCWs can be described as follows:

$$R = \begin{bmatrix} \text{Coupling Way} & \text{Previous} & M_i \\ & \text{Next} & M_j \\ & \text{Coupling Degree} & v_3 \\ & \text{Position Relationship} & v_4 \\ & \text{Permanency} & v_m \end{bmatrix}$$

In this matrix, $M_i$ and $M_j$ represent the $i$-th and $j$-th BCE, respectively, where $i$ and $j$ are integers satisfying $1 \le i, j \le 5$. The values $v_3$, $v_4$, and $v_m$ satisfy the following:

$$\begin{aligned} v_3 &\in \{\text{fusion}, \text{compounding}, \text{chimerism}, \text{unionization}, \ldots\} \\ v_4 &\in \{\text{Top--Bottom}, \text{Left--Right}, \text{Front--Back}, \text{Stacked}, \ldots\} \\ v_m &\in \{\text{Permanent}, \text{Temporary}\} \end{aligned}$$

The biological EM-MCB can be summarized as follows:

$$B = \left( M_1 \oplus M_2 \oplus M_3 \oplus M_4 \oplus M_5 \right) \wedge R = \begin{bmatrix} \text{Biological Coupling} & \text{Function} & \text{Monitoring, hunting, hiding, etc.} \\ & \text{BCE} & M_1 \oplus M_2 \oplus M_3 \oplus M_4 \oplus M_5 \\ & \text{BCW} & R \\ & \text{Working Env} & \text{Air, Water} \end{bmatrix}$$
where ⊕ signifies the generalized connection between coupling elements, while ∧ represents logical AND.
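To show how such an extensive model could be handled programmatically, the sketch below (our own; all class and field names are assumptions, not EM-MCB terminology) represents BCEs, BCWs, and the composite model B as plain Python records, instantiating M1 and M5 from the values above.

```python
# Illustrative data-structure sketch of the biological EM-MCB in Section 4.2.
# Field names and the example coupling way are assumptions for readability.
from dataclasses import dataclass, field


@dataclass
class CouplingElement:            # a BCE such as M1..M5
    category: str                 # "Structure", "Material", "Behavior", ...
    name: str                     # "Binocular", "Eyeball", ...
    features: dict                # characteristic -> value range


@dataclass
class CouplingWay:                # a BCW (one row of the R matrix)
    previous: str                 # name of M_i
    next: str                     # name of M_j
    degree: str                   # "fusion", "compounding", "chimerism", ...
    position: str                 # "Top-Bottom", "Left-Right", "Stacked", ...
    permanency: str               # "permanent" or "temporary"


@dataclass
class ExtensiveModel:             # B = (M1 ⊕ ... ⊕ M5) ∧ R
    function: str
    elements: list
    ways: list = field(default_factory=list)
    working_env: tuple = ("air", "water")


m1 = CouplingElement("Structure", "Binocular",
                     {"Direction of Optical Axis": "0-90 deg",
                      "Complete Vision Field": "150-360 deg",
                      "Binocular Overlap": "10-124 deg"})
m5 = CouplingElement("Behavior", "Eye Movement",
                     {"Range": "1-180 deg", "Speed": "600 deg/s"})
r15 = CouplingWay("Binocular", "Eye Movement",
                  degree="compounding", position="stacked", permanency="permanent")

B = ExtensiveModel("monitoring, hunting, hiding", [m1, m5], [r15])
```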

4.3. Bionic Model

The bionic EM-MCB aims to analogize and modify the biological EM-MCB based on engineering requirements through the application of bionics principles. This model not only embodies the principles of biological coupling but also incorporates engineering needs.
Referring to the biological model, a bionic EM-MCB also encompasses a coupling model and an extensive model. Based on the commonly accepted structure of a bionic active vision system [40,41,42,43,44,45,46], a coupling model for the bionic system is established, as illustrated in Figure 6. The perception of the bionic visual system is influenced by a range of factors, including the binocular configuration, image sensors, information transmission, and more. The motion mechanism is used to mimic the eye movements of vertebrates, and bionic algorithms are employed to accomplish visual perception tasks. These ECEs align closely with the biological BCEs: the image sensor corresponds to the eyeball, the motion mechanism to the extraocular muscles, motion control to the eye movement pattern, and so on. Likewise, the ECEs in the model can be increased as the research progresses, and the description of a single ECE can be further stratified and refined according to the requirements.
Analogous to Figure 5, we have constructed the bionic coupling model tailored for engineering applications, as depicted in Figure 6. Because specific applications differ, there are notable variations in the sensors, algorithms, and performance metrics employed in bionic systems, and in an extensive model these features must be quantitatively represented. Since our objective is to introduce a general conceptual framework, we refrain from constructing a specific extensive model here.

4.4. Engineering Conceptual Model

By combining the BioTRIZ solution, the shared characteristics of the biological instances, and the bionic EM-MCB, we can ultimately derive an engineering conceptual model, as illustrated in Figure 7.
The bionic active vision system must concurrently scan the global visual field and capture local precise information. To achieve this, the following ECEs have been implemented:
  • ECE1 Binocular Configuration.
    - Drawing inspiration from the binocular arrangement of prey animals, we adopt a lateral layout with optical axes perpendicular to the system midline, ensuring a wide vision field.
  • ECE2 Image Sensors.
    - Inspired by the visual enhancement provided by the auxiliary eyes of jumping spiders and the six-eyed spookfish, the retinal and foveal advantages in visual information acquisition, the double foveal distribution in raptors [50,51,55,60], and the recommendation of IP3-2, we increase the number of image sensors. Wide-angle cameras O1 and O2 perform global searching, while narrow-angle cameras C1, C2, C3, and C4 are dedicated to capturing local details. In Figure 7, assuming targets may appear anywhere within the panoramic range, the four narrow-angle cameras are positioned at 90° intervals, echoing the auxiliary eye count in the biological instances.
  • ECE3 Information Transmission.
    - Mimicking optic nerve conduction and the visual attention pathway [61], we process the panoramic imagery using a dynamic target recognition algorithm akin to frog or eagle vision. The recognition results then guide the narrow-angle cameras to track the target and acquire detailed information.
  • ECE4 Motion Mechanism.
    - The wide-angle cameras remain fixed. The narrow-angle cameras can be mounted on serial or parallel mechanisms, drawing on the characteristics of the extraocular muscles and IP3-3. The platform’s degrees of freedom and angular range are tailored to practical requirements.
  • ECE5 Motion Control.
    - Depending on specific requirements, including single- or multi-target tracking and 3D measurement, the gimbals carrying the narrow-angle cameras can mimic chameleon-like independent eye movements, in addition to binocular synergistic eye movements such as scanning and gazing (a run-time coordination sketch follows this list).
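The run-time coordination sketch referenced under ECE5 is given below. It is our own hypothetical Python pseudocode, not an implementation from this study; the detector, gimbal, and detection-record interfaces are placeholder names, not a specific library API. It shows one plausible cycle in which the wide-angle pair feeds a panoramic detector (ECE3) and each detection is handed to the nearest narrow-angle gimbal camera for tracking (ECE5).

```python
# Hypothetical sketch of the ECE2/ECE3/ECE5 coordination loop described above.
# `panoramic_detect`, the gimbal objects, and the detection records are placeholders.

NARROW_CAMERA_AZIMUTHS = [0.0, 90.0, 180.0, 270.0]  # C1..C4 mounted at 90 degree intervals


def _angular_distance(a_deg: float, b_deg: float) -> float:
    """Smallest absolute angle between two bearings, in degrees."""
    d = abs(a_deg - b_deg) % 360.0
    return min(d, 360.0 - d)


def nearest_camera(target_azimuth_deg: float) -> int:
    """Index of the narrow-angle camera mounted closest to the target bearing."""
    return min(range(len(NARROW_CAMERA_AZIMUTHS)),
               key=lambda i: _angular_distance(NARROW_CAMERA_AZIMUTHS[i], target_azimuth_deg))


def perception_step(wide_frames, gimbals, panoramic_detect):
    """One cycle: global search on the wide-angle pair, then local acquisition."""
    detections = panoramic_detect(wide_frames)       # ECE3: attention-like target selection
    for target in detections:                        # each detection carries a bearing
        cam = nearest_camera(target.azimuth_deg)     # assign the closest narrow-angle camera
        gimbals[cam].point_at(target.azimuth_deg,    # ECE5: saccade-like gaze shift
                              target.elevation_deg)
        gimbals[cam].track(target)                   # ECE5: fixation / smooth pursuit
```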

5. Conclusions

This paper introduces a novel bionic design paradigm that integrates BioTRIZ with multi-factor coupling bionics. The methodology provides a general conceptual design framework for bionic systems whose design requires a comprehensive process of engineering-to-biology mapping and biology-to-engineering inversion.
To explain the framework intuitively, the design process of a bionic vision system integrating multiple biological characteristics is presented. The design results show that this system not only draws inspiration from biological instances but also extends beyond the original capabilities of the individual creatures. As a result, it offers a compelling bionic solution for synchronously achieving omnidirectional surveillance, precise localization, and multi-target tracking in engineering applications.
Our future research will be directed towards refining this general framework to more precisely align with the design imperatives of specific engineering challenges. Additionally, we will concentrate on the concrete implementation strategies for the engineering conceptual model of the bionic active vision system designed in this study, meticulously selecting appropriate mechanical structures and equipment models to facilitate the acquisition of more quantifiable performance metrics.

Author Contributions

Formal analysis, B.W.; Methodology, B.W.; Project administration, D.Y.; Supervision, D.Y.; Writing—original draft, B.W.; Writing—review and editing, D.Y. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by the National Natural Science Foundation of China under Grant No. 51375368.

Data Availability Statement

The original contributions presented in this study are included in the article; further inquiries can be directed to the corresponding author.

Conflicts of Interest

The paper is original in its contents and is not under consideration for publication in any other journals or proceedings. On behalf of all authors, the corresponding author states that there are no competing financial interests or personal relationships that could have appeared to influence the work reported in this paper. The funders had no role in the design of the study; in the collection, analyses, or interpretation of data; in the writing of the manuscript; or in the decision to publish the results.

References

  1. Cheong, H.; Shu, L. Using templates and mapping strategies to support analogical transfer in biomimetic design. Des. Stud. 2013, 34, 706–728. [Google Scholar] [CrossRef]
  2. Helms, M.; Vattam, S.S.; Goel, A.K. Biologically inspired design: Process and products. Des. Stud. 2009, 30, 606–622. [Google Scholar] [CrossRef]
  3. Mak, T.; Shu, L. Using descriptions of biological phenomena for idea generation. Res. Eng. Des. 2008, 19, 21–28. [Google Scholar] [CrossRef]
  4. Benyus, J.M. Biomimicry: Innovation Inspired by Nature; Harper Perennial: New York, NY, USA, 1997. [Google Scholar]
  5. Nagel, R.L.; Midha, P.A.; Tinsley, A.; Stone, R.B.; McAdams, D.A.; Shu, L. Exploring the use of functional models in biomimetic conceptual design. J. Mech. Des. 2008, 130, 121102. [Google Scholar] [CrossRef]
  6. Hoyos, C.M.; Fiorentino, C. Bio-utilization. Int. J. Des. Objects 2017, 10, 1. [Google Scholar]
  7. Snell-Rood, E.C.; Smirnoff, D. Biology for biomimetics I: Function as an interdisciplinary bridge in bio-inspired design. Bioinspir. Biomimetics 2023, 18, 052001. [Google Scholar] [CrossRef]
  8. Nkandu, M.I.; Alibaba, H.Z. Biomimicry as an alternative approach to sustainability. Archit. Res. 2018, 8, 1–11. [Google Scholar]
  9. Dash, S.P. Application of biomimicry in building design. Int. J. Civ. Eng. Technol. 2018, 9, 644–660. [Google Scholar]
  10. Vincent, J.F.; Bogatyreva, O.A.; Bogatyrev, N.R.; Bowyer, A.; Pahl, A.K. Biomimetics: Its practice and theory. J. R. Soc. Interface 2006, 3, 471–482. [Google Scholar] [CrossRef]
  11. Ren, L.; Liang, Y. Biological couplings: Classification and characteristic rules. Sci. China Ser. Technol. Sci. 2009, 52, 2791–2800. [Google Scholar] [CrossRef]
  12. Ren, L.; Liang, Y. Biological couplings: Function, characteristics and implementation mode. Sci. China Technol. Sci. 2010, 53, 379–387. [Google Scholar] [CrossRef]
  13. Snell-Rood, E. Interdisciplinarity: Bring biologists into biomimetics. Nature 2016, 529, 277–278. [Google Scholar] [CrossRef]
  14. Bian, Z.; Luo, S.; Zheng, F.; Wang, L.; Shan, P. Semantic reasoning of product biologically inspired design based on BERT. Appl. Sci. 2021, 11, 12082. [Google Scholar] [CrossRef]
  15. Devlin, J.; Chang, M.W.; Lee, K.; Toutanova, K. Bert: Pre-training of deep bidirectional transformers for language understanding. arXiv 2018, arXiv:1810.04805. [Google Scholar] [CrossRef]
  16. Deng, Z.; Lv, J.; Liu, X.; Hou, Y. Bionic Design Model for Co-creative Product Innovation Based on Deep Generative and BID. Int. J. Comput. Intell. Syst. 2023, 16, 8. [Google Scholar] [CrossRef]
  17. Kruiper, R.; Vincent, J.F.; Chen-Burger, J.; Desmulliez, M.P.; Konstas, I. A scientific information extraction dataset for nature inspired engineering. arXiv 2020, arXiv:2005.07753. [Google Scholar]
  18. Vandevenne, D.; Verhaegen, P.A.; Dewulf, S.; Duflou, J.R. A scalable approach for ideation in biologically inspired design. AI EDAM 2015, 29, 19–31. [Google Scholar] [CrossRef]
  19. Vandevenne, D.; Pieters, T.; Duflou, J.R. Enhancing novelty with knowledge-based support for Biologically-Inspired Design. Des. Stud. 2016, 46, 152–173. [Google Scholar] [CrossRef]
  20. Shu, L.; Cheong, H. A Natural Language Approach to Biomimetic Design In Biologically Inspired Design: Computational Methods and Tools; Springer: London, UK, 2014; pp. 29–61. ISBN 978-1-4471-5248-4. [Google Scholar] [CrossRef]
  21. Cheong, H.; Shu, L. Retrieving causally related functions from natural-language text for biomimetic design. J. Mech. Des. 2014, 136, 081008. [Google Scholar] [CrossRef]
  22. Rugaber, S.; Bhati, S.; Goswami, V.; Spiliopoulou, E.; Azad, S.; Koushik, S.; Kulkarni, R.; Kumble, M.; Sarathy, S.; Goel, A. Knowledge extraction and annotation for cross-domain textual case-based reasoning in biologically inspired design. In Proceedings of the Case-Based Reasoning Research and Development: 24th International Conference, ICCBR 2016, Atlanta, GA, USA, 31 October–2 November 2016; Proceedings 24. Springer: Berlin/Heidelberg, Germany, 2016; pp. 342–355. [Google Scholar] [CrossRef]
  23. Zhao, Y.; Baldini, I.; Sattigeri, P.; Padhi, I.; Lee, Y.K.; Smith, E. Data driven techniques for organizing scientific articles relevant to biomimicry. In Proceedings of the 2018 AAAI/ACM Conference on AI, Ethics, and Society, New Orleans, LA, USA, 2–3 February 2018; pp. 347–353. [Google Scholar] [CrossRef]
  24. Nagel, J.K.; Nagel, R.L.; Stone, R.B.; McAdams, D.A. Function-based, biologically inspired concept generation. AI EDAM 2010, 24, 521–535. [Google Scholar] [CrossRef]
  25. Abdala, L.N.; Fernandes, R.B.; Ogliari, A.; Löwer, M.; Feldhusen, J. Creative contributions of the methods of inventive principles of TRIZ and BioTRIZ to problem solving. J. Mech. Des. 2017, 139, 082001. [Google Scholar] [CrossRef]
  26. Liu, X.; Li, J.; Chen, L.; Cheng, G. Bionic prototype acquisition incorporating extension and multi-level knowledge modeling. J. Mech. Eng. 2019, 55, 150–160. [Google Scholar]
  27. Mak, T.; Shu, L. Abstraction of biological analogies for design. Cirp Ann. 2004, 53, 117–120. [Google Scholar] [CrossRef]
  28. Cao, G.; Sun, Y.; Tan, R.; Zhang, J.; Liu, W. A function-oriented biologically analogical approach for constructing the design concept of smart product in Industry 4.0. Adv. Eng. Inform. 2021, 49, 101352. [Google Scholar] [CrossRef]
  29. Hou, X.T.; Liu, W.; Cao, G.Z.; Wu, Z.f.; Guo, Z.B. Research on design method of function combination product based on multi biological effects. Chin. J. Eng. Des. 2017, 24, 18–26. [Google Scholar]
  30. Bai, Z.; Song, M.; Zhang, X.; Zhang, J. Biological Prototype Acquisition Based on Biological Coupling in Bionic Design. Appl. Bionics Biomech. 2022, 2022, 8458243. [Google Scholar] [CrossRef]
  31. Heesy, C.P. Seeing in stereo: The ecology and evolution of primate binocular vision and stereopsis. Evol. Anthropol. Issues News Rev. 2009, 18, 21–35. [Google Scholar] [CrossRef]
  32. Read, J.C. Binocular vision and stereopsis across the animal kingdom. Annu. Rev. Vis. Sci. 2021, 7, 389–415. [Google Scholar] [CrossRef]
  33. Tyrrell, L.P.; Fernández-Juricic, E. Avian binocular vision: It’s not just about what birds can see, it’s also about what they can’t. PLoS ONE 2017, 12, e0173235. [Google Scholar] [CrossRef]
  34. Potier, S.; Mitkus, M.; Kelber, A. Visual adaptations of diurnal and nocturnal raptors. In Seminars in Cell & Developmental Biology; Elsevier: London, UK, 2020; Volume 106, pp. 116–126. [Google Scholar] [CrossRef]
  35. Nilsson, D.E. The diversity of eyes and vision. Annu. Rev. Vis. Sci. 2021, 7, 19–41. [Google Scholar] [CrossRef]
  36. Zhang, H.; Lee, S. Robot bionic vision technologies: A review. Appl. Sci. 2022, 12, 7970. [Google Scholar] [CrossRef]
  37. Zhai, G.; Zhang, W.; Hu, W.; Ji, Z. Coal mine rescue robots based on binocular vision: A review of the state of the art. IEEE Access 2020, 8, 130561–130575. [Google Scholar] [CrossRef]
  38. Zhang, S.; Li, B.; Ren, F.; Dong, R. High-precision measurement of binocular telecentric vision system with novel calibration and matching methods. IEEE Access 2019, 7, 54682–54692. [Google Scholar] [CrossRef]
  39. Liu, Y.; Zhu, D.; Peng, J.; Wang, X.; Wang, L.; Chen, L.; Li, J.; Zhang, X. Real-time robust stereo visual SLAM system based on bionic eyes. IEEE Trans. Med. Robot. Bionics 2020, 2, 391–398. [Google Scholar] [CrossRef]
  40. Hu, L.; Shen, C. A study of visual servo system based on binocular camera. In Proceedings of the 5th International Conference on Electrical Engineering and Automatic Control; Springer: Berlin/Heidelberg, Germany, 2016; pp. 1105–1112. [Google Scholar] [CrossRef]
  41. Hu, P.; Hao, X.; Li, J.; Cheng, C.; Wang, A. Design and implementation of binocular vision system with an adjustable baseline and high synchronization. In Proceedings of the 2018 IEEE 3rd International Conference on Image, Vision and Computing (ICIVC), Chongqing, China, 27–29 June 2018; IEEE: New York, NY, USA, 2018; pp. 566–570. [Google Scholar] [CrossRef]
  42. Xu, X.; Sun, Y.; Duan, H.; Deng, Y.; Zeng, Z. Maritime Target Saliency Detection for UAV Based on the Stimulation Competition Selection Mechanism of Raptor Vision. Guid. Navig. Control. 2023, 3, 2350012. [Google Scholar] [CrossRef]
  43. Jia, H.; Li, S. Scene Analysis Based on Horse Vision System. In Proceedings of the MVA 2011 IAPR Conference on Machine Vision Applications, Nara, Japan, 13–15 June 2011; pp. 267–270. [Google Scholar]
  44. Xu, Y.; Liu, C.; Cui, H.; Song, Y.; Yue, X.; Feng, L.; Wu, L. Environment Perception with Chameleon-Inspired Active Vision Based on Shifty Behavior for WMRs. Appl. Sci. 2023, 13, 6069. [Google Scholar] [CrossRef]
  45. Wang, B.; Zhang, B.; Yu, D. The Bionic Research on Avian Visual Structure in Multi-Target Monitoring. In Proceedings of the 5th International Conference on Advanced Design and Manufacturing Engineering, Shenzhen, China, 19–20 September 2015; Atlantis Press: Paris, France, 2015; pp. 340–346. [Google Scholar] [CrossRef]
  46. Bogatyreva, O.; Shillerov, A.; Bogatyrev, N. Patterns in TRIZ contradiction matrix: Integrated and distributed systems. In Proceedings of the 4th ETRIA Symposium, Florence, Italy, 3–5 November 2004; pp. 35–42. [Google Scholar]
  47. Batemanazan, V.; Jaafar, A.; Kadir, R.A.; Nayan, N.M. Improving usability with TRIZ: A review. In Proceedings of the Advances in Visual Informatics: 5th International Visual Informatics Conference, IVIC 2017, Bangi, Malaysia, 28–30 November 2017; Proceedings 5. Springer: Berlin/Heidelberg, Germany, 2017; pp. 625–635. [Google Scholar] [CrossRef]
  48. Lv, B.; Xue, Z.; Wei, H.; Li, Y. Exploration of Design Methods Based on Bionic Functional Modules. In Journal of Physics: Conference Series; IOP Publishing: Bristol, UK, 2021; Volume 1939, p. 012078. [Google Scholar] [CrossRef]
  49. Yongquan, Y.; Ying, H.; Minghui, W. The related matter-elements in extension detecting and application. In Proceedings of the Third International Conference on Information Technology and Applications (ICITA’05), Sydney, Australia, 4–7 July 2005; IEEE: New York, NY, USA, 2005; Volume 1, pp. 411–414. [Google Scholar] [CrossRef]
  50. Jones, M.P.; Pierce, K.E., Jr.; Ward, D. Avian vision: A review of form and function with special consideration to birds of prey. J. Exot. Pet Med. 2007, 16, 69–87. [Google Scholar] [CrossRef]
  51. Pettigrew, J.D. Evolution of Binocular Vision. In Visual Neuroscience; Cambridge University Press: Cambridge, UK, 1986; pp. 208–222. ISBN 0521258294. [Google Scholar]
  52. McComb, D.; Tricas, T.; Kajiura, S. Enhanced visual fields in hammerhead sharks. J. Exp. Biol. 2009, 212, 4010–4018. [Google Scholar] [CrossRef] [PubMed]
  53. Clarke, P.; Whitteridge, D. The projection of the retina, including the ‘red area’, on to the optic tectum of the pigeon. Q. J. Exp. Physiol. Cogn. Med. Sci. Transl. Integr. 1976, 61, 351–358. [Google Scholar] [CrossRef]
  54. O’Rourke, C.T.; Hall, M.I.; Pitlik, T.; Fernández-Juricic, E. Hawk eyes I: Diurnal raptors differ in visual fields and degree of eye movement. PLoS ONE 2010, 5, e12802. [Google Scholar] [CrossRef]
  55. Gregory-Evans, C.Y.; Gregory-Evans, K. Foveal hypoplasia: The case for arrested development. Expert Rev. Ophthalmol. 2011, 6, 565–574. [Google Scholar] [CrossRef]
  56. Fite, K.V.; Lister, B.C. Bifoveal vision in Anolis lizards. Brain, Behav. Evol. 1981, 19, 144–154. [Google Scholar] [CrossRef] [PubMed]
  57. Tucker, V.A. The deep fovea, sideways vision and spiral flight paths in raptors. J. Exp. Biol. 2000, 203, 3745–3754. [Google Scholar] [CrossRef] [PubMed]
  58. Pettigrew, J.D.; Collin, S.P.; Ott, M. Convergence of specialised behaviour, eye movements and visual optics in the sandlance (Teleostei) and the chameleon (Reptilia). Curr. Biol. 1999, 9, 421–424. [Google Scholar] [CrossRef]
  59. Land, M.F. Eye movements of vertebrates and their relation to eye form and function. J. Comp. Physiol. A 2015, 201, 195–214. [Google Scholar] [CrossRef] [PubMed]
  60. Waldvogel, J.A. The bird’s eye view. Am. Sci. 1990, 78, 342–353. [Google Scholar]
  61. Tanner, J.; Itti, L. A top-down saliency model with goal relevance. J. Vis. 2019, 19, 11. [Google Scholar] [CrossRef]
Figure 1. Bio-inspired design framework. The engineering issue is mapped onto the bio-space using BioTRIZ. Once the BioTRIZ solution and relevant biological instances are identified, the EM-MCB is constructed by examining the coupling elements that impact the biological function. Subsequently, the biological solution is translated back into the engineering domain to derive the engineering conceptual model.
Figure 2. BioTRIZ implementation. When tackling an invention problem with BioTRIZ, the initial step involves modeling the engineering issue as a BioTRIZ issue, utilizing OFs to articulate the conflicts. Subsequently, IPs that align with the requirements are acquired by referencing the BioTRIZ contradiction matrix. Lastly, biological solutions are derived by amalgamating the IPs with prototypes present in the biological database.
Figure 3. Illustration of the bionic binocular vision system parameters. The viewing angle of an image sensor represents the monocular vision field range. The direction of the optical axis dictates the extent of binocular overlap. The overall vision field comprises the left and right lateral vision fields and the binocular overlap. Regions beyond these are designated as blind areas.
Figure 4. Biology to engineering inversion. Once a solution is derived from the biology domain, the translation of biological properties into engineering realizations becomes necessary. Initially, appropriate biological prototypes are identified as bionic targets, and their distinctive features, such as structure, morphology, and neural mechanism, are thoroughly analyzed. Subsequently, the EM-MCB is constructed using these BCEs and analogous principles in the engineering domain. Eventually, a comprehensive bionic system and an engineering conceptual model with several modules can be created.
Figure 5. The Biological Coupling Model of Binocular Vision for Vertebrates. In this model, BCEs represent the factors that affect the visual perception of vertebrates. These elements encompass binocular configuration, which involves the eyes’ positional relationship, head position, and orientation, as well as eyeball shape and structure. The optic nerve conducts visual signals to the cerebral cortex, forming the visual pathway, while extraocular muscles enable eye rotations for sweeping and gazing movements.
Figure 6. The coupling model of the bionic active vision system. This model encompasses several ECEs and their features. The binocular configuration involves the relative positioning and orientation of the eyes. Image sensors may differ in wavelength sensitivity and vision field. Information transmission relies on visual perception algorithms and communication with higher-level computational units. The motion mechanism enables the rotation and shifting of image sensors, while motion control achieves sweeping and gazing movements.
Figure 7. Engineering conceptual model of the bionic active vision system. The schematic provides a comprehensive overview of the mechatronic system structure, drivers, and algorithms. It features wide-angle cameras O1 and O2 mounted back-to-back, offering a 360° panoramic view. This design mimics the eye layout of prey animals, enabling quick object localization within the panoramic range using detection algorithms. Additionally, narrow-angle cameras C1, C2, C3, and C4 are positioned at 90° intervals and equipped with motion mechanisms for orientation changes. Once a tracking target is identified, one or two of these narrow-angle cameras are activated for tracking or 3D measurement. The Advanced Intelligent Decision unit mimics the visual cortex of the brain. When the bionic active vision system is considered as a perceptual module, the decision unit can make high-level decisions for mechatronic systems, such as robots, by analyzing the information abstracted from the visual perception module.
Table 1. BioTRIZ contradiction matrix [10]. Select the OF to be improved from the six arranged vertically, and then identify the OF that will deteriorate from the six listed horizontally. The intersection of the selected row and column indicates the corresponding IPs, which constitute the BioTRIZ solution.
Fields      | Substance          | Structure               | Space         | Time           | Energy            | Information
Substance   | 13 15 17 20 31 40  | 1–3 15 24 26            | 1 5 13 15 31  | 15 19 27 29 30 | 3 6 9 25 31 35    | 3 25 26
Structure   | 1 10 15 19         | 1 15 19 24 34           | 10            | 1 2 4          | 1 2 4             | 1 3 4 15 19 24 25 35
Space       | 3 14 15 25         | 2–5 10 15 19            | 4 5 36 14 17  | 1 19 29        | 1 3 4 15 19       | 3 15 21 24
Time        | 1 3 15 20 25 38    | 1–4 6 15 17 19          | 1–4 7 38      | 2 3 11 20 26   | 3 9 15 20 22 25   | 1–3 10 19 23
Energy      | 1 3 13 14 17 25 31 | 1 3 5 6 25 36 40        | 1 3 4 15 25   | 3 10 23 25 35  | 3 5 9 22 25 32 37 | 1 3 4 15 16 25
Information | 1 6 22             | 1 3 6 18 22 24 32 34 40 | 3 20 22 25 33 | 2 3 9 17 22    | 1 3 6 22 32       | 3 10 16 23 25
Table 2. BioTRIZ solutions and biological instances. The table comprises three columns presenting the results of engineering-to-biology mapping, arranged from left to right. Column 1 lists the BioTRIZ issue (OFs), column 2 the BioTRIZ solutions (IPs), and column 3 the corresponding biological instances. This table serves as the complete biological solution for biology-to-engineering inversion.
OFs: Improved: Space; Deteriorated: Information
IPs: IP3 Local Quality; IP15 Dynamics; IP21 Skipping; IP24 Intermediary
Supporting Biological Instances:
  • Six-eyed spookfish’s eyes: Two pairs auxiliary, one pair primary.
  • Jumping spider’s eyes: Eight total, two for stereoscopic vision, six for omnidirectional.
  • Brownsnout spookfish’s auxiliary eyes: Create clear images via light reflection and focusing.
  • Whirligig beetle’s eyes: One submerged underwater to hunt for prey, one above the water to keep watch for predators.
  • Starling’s eyes: One for overall scene, one for details.
  • Anableps anableps’ lens: Thicker lower part for underwater gaze, upper part for air scanning.
  • Nocturnal gecko’s multifocal lens: Different parts of the lens focus a different range of wavelengths onto the eye’s light-sensitive cells.
  • Scallop’s retinas: One responds to light, the other to sudden darkness.
  • Hammerhead shark’s eyes: Located on “hammer” sides, enhancing stereoscopic perception and wide view.
  • Ghost crab’s eyes: On movable stalks, providing full range of vision.
  • Chameleon’s eyes: Rotate freely, switching between monocular and binocular vision.
  • Vertebrate’s reflective tapetum: Improves low-light visual sensitivity.
  • Horseshoe crab’s eyes: Sensitive to polarized light, reducing sun glare.
  • Jewel scarab beetle’s eyes: Distinguish polarized from unpolarized light.
  • Lobster’s eyes: Feature square tubes focusing reflected light on retina.
  • West Indian wood snake’s eyes: Release blood to deter predators.
  • Locust’s eyes: Recognize only movement interfering with flight path.
  • Reef heron’s head position: Adjusts for surface light refraction, maintaining relationship between real and apparent prey depth.
  • Kestrel’s eyes: Four reflexes enable focus during body movement.
  • Dragonfly’s eyes: Capture up to 300 images/second, enhancing movement perception and compensating for lack of visual sharpness.
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
