Problem-Solving Methods, Thinking, and Design of Experiments
This article covers the basic concept of problem solving, various methods of problem analysis, hypothesis formulation and estimation, and the concrete steps of problem solving.
Next, we discuss the “billiard method” of problem analysis, the systems-thinking approach, and KPI, KGI, OKR, and other methods often used in corporate improvement activities.
Next, we cover Fermi estimation for quantifying issues and target values; reasoning methods for formulating hypotheses, including deduction, induction, projection, analogy, and abduction; and methods for verifying these hypotheses without falling into “confirmation bias.”
Finally, PDCA, a concrete problem-solving framework, and the design of experiments within PDCA are described, drawing on methods of causal inference.
Topics
Algorithmic thinking refers to the ability, or the process, of devising logical procedures and approaches for problem solving and task execution, and it is an important skill when dealing with a variety of complex challenges. “Problem partitioning” in algorithmic thinking is the process of dividing a large problem into a number of smaller sub-problems. This approach breaks complex problems down into manageable units, making large tasks easier to understand and allowing each sub-problem to be solved individually and efficiently. Problem partitioning can be seen as the first step in the general problem-solving process.
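A classic illustration of problem partitioning is divide-and-conquer: the sketch below uses merge sort, where a large sorting problem is split into two smaller ones that are solved independently and then combined. This is only one illustrative instance of the general idea, not a prescribed method.

```python
def merge_sort(items):
    """Divide-and-conquer: split the problem, solve sub-problems, combine."""
    if len(items) <= 1:                 # base case: trivially solved sub-problem
        return items
    mid = len(items) // 2
    left = merge_sort(items[:mid])      # solve each half independently
    right = merge_sort(items[mid:])
    merged = []                         # combine: merge the two sorted halves
    i = j = 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i]); i += 1
        else:
            merged.append(right[j]); j += 1
    merged.extend(left[i:])
    merged.extend(right[j:])
    return merged

print(merge_sort([5, 2, 9, 1, 5, 6]))  # → [1, 2, 5, 5, 6, 9]
```

Each recursive call handles a sub-problem small enough to be solved directly, mirroring the decomposition step described above.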
Domain-Driven Design (DDD) is a software design methodology based on an understanding of the business domain. This section describes DDD.
- Abstraction-based approaches in summarisation and AI-based communication support
‘Overview of automatic summarisation technology, algorithms and examples of implementation’ describes AI-based summarisation technology. Automatic summarisation is widely used in information retrieval, information processing, natural language processing, machine learning and related fields to compress long documents into short, to-the-point forms and to make the summarised information easier to understand. It can be broadly divided into two types: extractive summarisation and abstractive summarisation. Here, we consider a qualitative approach to abstractive summarisation based on the ‘one-word summarisation technique’.
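To make the extractive/abstractive distinction concrete, here is a minimal sketch of the extractive side: sentences are scored by the average frequency of their words, and the top-scoring ones are returned verbatim. This is a toy frequency heuristic for illustration only, not the abstractive ‘one-word summarisation technique’ discussed above.

```python
import re
from collections import Counter

def extractive_summary(text, n_sentences=1):
    """Score sentences by average word frequency; return the top ones in order."""
    sentences = re.split(r'(?<=[.!?])\s+', text.strip())
    freq = Counter(re.findall(r'\w+', text.lower()))
    def score(s):
        toks = re.findall(r'\w+', s.lower())
        return sum(freq[t] for t in toks) / max(len(toks), 1)
    ranked = sorted(sentences, key=score, reverse=True)
    top = set(ranked[:n_sentences])
    return ' '.join(s for s in sentences if s in top)  # keep original order

text = ("Summarisation compresses long text. Extractive summarisation selects "
        "existing sentences. Abstractive summarisation generates new sentences.")
print(extractive_summary(text, 1))
```

Abstractive summarisation, by contrast, would generate a new sentence rather than select an existing one, which is why it typically requires a generative language model.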
Fermi estimation is a method of logically decomposing a problem of interest, obtaining partial answers using assumptions, and synthesizing them to derive a solution to the problem. Using this method, a reasonable answer can be derived even when there is uncertainty, lack of data, or inaccurate information, and it is a way for people without scientific knowledge to make reasonable inferences from the events around them. This method was often used by physicist Enrico Fermi, hence the name “Fermi Estimation”.
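The decompose-estimate-synthesize procedure can be sketched with the classic “piano tuners in a city” example. Every number below is an explicitly stated rough assumption, not measured data; the point is the structure of the reasoning, and only the order of magnitude of the answer is meaningful.

```python
# Fermi estimation sketch: roughly how many piano tuners work in a city?
# All inputs are illustrative assumptions.
population        = 3_000_000   # assumed city population
people_per_house  = 3           # assumed average household size
pianos_per_house  = 1 / 20      # assume 1 in 20 households owns a piano
tunings_per_year  = 1           # assume each piano is tuned once a year
tunings_per_day   = 4           # assume one tuner services 4 pianos a day
workdays_per_year = 250

# decompose into sub-estimates, then synthesize
households     = population / people_per_house
pianos         = households * pianos_per_house
tunings_needed = pianos * tunings_per_year
tuner_capacity = tunings_per_day * workdays_per_year
tuners         = tunings_needed / tuner_capacity
print(round(tuners))  # order-of-magnitude answer, not a precise count
```

Even if each assumption is off by a factor of two, the errors tend to partially cancel, which is why such estimates often land within the right order of magnitude.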
As described in “Zen Philosophy and History, Mahayana Buddhism, Taoism, and Christianity,” Zen is a religion whose goal is enlightenment. In Zen Buddhism, enlightenment means “to become aware of the Buddha-nature, which is the true nature of all living beings,” and Buddha-nature is “the ability to recognize things that are beyond verbal understanding.” Here I would like to discuss meditation and enlightenment, which occupy an important place in Zen.
As described in “Meditation, Enlightenment, and Problem Solving,” mindfulness meditation and Zen vipassana meditation are forms of “insight meditation” that emphasize “awareness” and “attention as it is,” an approach that focuses on developing concentration and observing things as they are. A similar approach is also described in “Invitation to Cognitive Science.” In cognitive science, which is also discussed in the “Reading Notes,” this is called “metacognition”: an individual’s awareness of his or her own thinking and knowledge, understood as knowing what one knows and understands.
AI technology offers a feasible approach to various aspects of such metacognition.
This section discusses the “billiard method” of questioning and the importance of not making “category mistakes” when organizing the questions. These methods allow for a comprehensive and systematic analysis of issues.
- Scientific Thinking (1): What is science?
The first step is to clarify the difference between “words spoken within science” and “words that speak about science.” “Words spoken within science” are “scientific concepts” such as DNA and entropy, which are defined within scientific theories. “Words that speak about science,” on the other hand, are “meta-scientific concepts” that appear across theories, such as theory, hypothesis, law, and equation, and their meaning must be precisely understood in order to think scientifically.
The first two of these “meta-scientific concepts” are “theory” and “fact.” Scientific theories and hypotheses start from the assumption that the world is indeterminate and ambiguous (100% truth either does not exist or would take a lifetime to establish), and are therefore judged from a relative perspective, asking which theory or hypothesis is better, rather than assigning an absolute 1 or 0.
Here, the expected functions of science are “to predict,” “to apply,” “to explain,” etc. Among these, I will discuss “explaining,” which appears frequently in the preceding paragraphs.
- Scientific Thinking (2): Inference patterns for hypothesis testing
The inference patterns for tracing relationships between facts described in the previous section include the deductive method, which derives a proposition from a given statement or group of propositions, and four non-deductive methods: induction, projection, analogy, and abduction.
Deduction, for example, leads from the premises “all fish have gills” and “eels are fish” to the conclusion “therefore eels have gills”: the truth or falsehood of the premises carries over to the conclusion, and the overall amount of information does not increase.
The common denominator of the four non-deductive inferences is that they are “probable” rather than “necessary”: even if the premises are correct, the conclusion is not guaranteed to be correct. (Deductive reasoning is necessary, meaning that if the premises are correct, the conclusion is always correct.) In exchange, non-deductive reasoning adds information in the conclusion that was not contained in the premises.
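The contrast between necessary and probable inference can be sketched in a few lines. The examples (fish/gills, white swans) mirror the classic cases; the code is purely illustrative.

```python
# Deduction: the conclusion is already contained in the premises (necessary).
fish = {"eel", "salmon", "tuna"}      # premise: eels are fish
def has_gills(animal):                # premise: all fish have gills
    return animal in fish
print(has_gills("eel"))               # follows necessarily; no new information

# Enumerative induction (non-deductive): generalise beyond the observations.
observed_swans = ["white", "white", "white"]
all_swans_white = all(c == "white" for c in observed_swans)
print(all_swans_white)  # holds for the sample, yet a black swan may still exist
```

The inductive conclusion “all swans are white” says more than the three observations warrant, which is exactly why it can be overturned by a single counter-example.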
- Scientific Thinking (3): Confirmation bias and quadrant tables
After forming a hypothesis by such reasoning, it is necessary to verify whether the hypothesis is correct. Although the word “verification” is sometimes used loosely to mean a detailed examination, its primary meaning is precisely this act of ascertaining whether or not a hypothesis is correct.
To think about verification, consider the following task: an examiner has a rule about sequences of three natural numbers in mind, answers “yes” if a proposed sequence fits the rule and “no” if it does not, and the subject must deduce the rule from these hints (Wason’s “2-4-6” task; the related “four-card problem” is known as the Wason selection task).
In such cases, it is not enough to collect only examples that fit your hypothesis; to find a better hypothesis, you must also present something that does not fit it (a potential counter-example), or you will never reach the correct answer. Many people, however, fall into the trap of looking only for examples that fit the hypothesis they are predicting, even when they should be checking whether it might be wrong. This is called “confirmation bias.”
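The 2-4-6 trap can be simulated directly. Below, the hidden rule is “any strictly increasing triple,” while the subject’s hypothesis is the narrower “each number increases by 2”; every confirming test passes under both rules, so only a deliberate counter-example exposes the difference. The specific rules and test triples are illustrative.

```python
# Wason's 2-4-6 task sketch: hidden rule vs. a narrower hypothesis.
hidden_rule = lambda a, b, c: a < b < c                   # actual rule
hypothesis  = lambda a, b, c: b == a + 2 and c == b + 2   # guessed rule

confirming_tests = [(2, 4, 6), (10, 12, 14), (1, 3, 5)]   # all fit the guess
# Every confirming test also satisfies the hidden rule, so nothing is learned:
print(all(hidden_rule(*t) for t in confirming_tests))      # True

# A test designed to fail the hypothesis is the one that is informative:
t = (1, 2, 10)
print(hidden_rule(*t), hypothesis(*t))   # "yes" from the examiner, yet the
                                         # guess rejects it → guess is too narrow
```

Confirming tests can never distinguish the two rules; only a triple that the hypothesis rejects but the examiner accepts forces a revision.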
- About KPI, KGI, and OKR (1): Methods for clarifying issues
Three-letter abbreviations such as KPI (Key Performance Indicator) and KGI (Key Goal Indicator) appear frequently in problem-solving contexts. In this article, I would like to discuss them.
First, let’s talk about KPIs. The term can be interpreted and used in various ways: in some cases, KPIs are simply used as “indicators to manage results,” while in other cases they express “the target values themselves.” Here, KPIs are defined as “a means of identifying points of focus and measuring progress toward achieving (business) goals without waste.”
- About KPI, KGI, and OKR (2): Methods for clarifying issues
OKR stands for “Objectives and Key Results,” a framework used at leading companies in Silicon Valley and elsewhere in the United States and Europe.
Unlike KPIs, the goal, that is, the objective, is expressed qualitatively, while the key results are set quantitatively. Here, “objectives” answer the questions “What do we want to achieve? Where are we headed?”, and “key results” answer “How will we achieve our objectives? How do we know we are getting closer to them?”
Since milestones toward the objective are “key result indicators,” the pace of the project can be reviewed by knowing how well it is progressing.
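The qualitative-objective / quantitative-key-result split can be sketched as a small data structure. The objective statement, targets, and current values below are invented for illustration; scoring progress as the average of key-result completion is one common convention, not the only one.

```python
from dataclasses import dataclass, field

@dataclass
class KeyResult:
    description: str
    target: float          # quantitative: measurable milestone
    current: float = 0.0
    def progress(self):
        return min(self.current / self.target, 1.0)

@dataclass
class Objective:
    statement: str         # qualitative: where are we headed?
    key_results: list = field(default_factory=list)
    def progress(self):    # pace check: average of key-result progress
        return sum(kr.progress() for kr in self.key_results) / len(self.key_results)

okr = Objective("Delight new users", [
    KeyResult("Cut onboarding time to 5 minutes", target=100, current=60),
    KeyResult("Raise first-week retention to 40%", target=100, current=20),
])
print(f"{okr.progress():.0%}")  # → 40%
```

Reviewing `progress()` at each milestone is what lets the pace of the project be adjusted, as described above.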
- Specific examples of KPIs
The key points in setting KPIs are (1) to secure business performance, sustain the company, and protect employment (from the perspective of the company and its employees), and (2) to contribute to society as a public institution through customer satisfaction and the distribution of results (as required by investors and society).
To accomplish this, seven steps are implemented, including: (4) establish the three pillars of customer satisfaction (Q: quality, C: cost, D: delivery time); (5) continue to provide quality products and services that earn customer satisfaction by translating knowledge into action; (6) turn customer satisfaction into corporate profit, which is used for taxes, dividends, internal reserves, bonuses, reinvestment, and so on; and (7) repeat continuous improvement.
- Problem Solving with PDCA
The PDCA cycle of PLAN, DO, CHECK, and ACT is a long-established framework. It is used by many companies as a management method for improving manufacturing processes and for managing teams and projects.
Specifically, the framework consists of the following cycle: analyze the current situation, formulate a hypothesis based on that analysis, set target values with which to verify the hypothesis, develop and execute an action plan against those targets, verify and analyze how the results compare to the targets, and formulate the next hypothesis based on that analysis.
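The cycle above can be sketched as a loop in which each iteration plans a target, acts, checks the result against the target, and carries the gap into the next plan. The improvement step (closing half of the remaining gap each cycle) and all numbers are purely illustrative assumptions.

```python
# PDCA sketch: plan a target, do the action, check the gap, act on the learning.
def pdca(current_value, target, improve, cycles=3):
    history = []
    for _ in range(cycles):
        plan = target                     # Plan: set a measurable target
        result = improve(current_value)   # Do: execute the action plan
        gap = plan - result               # Check: compare result to target
        history.append((result, gap))
        current_value = result            # Act: feed learning into the next cycle
    return history

# assumed improvement step: each cycle closes 50% of the remaining gap to 100
for result, gap in pdca(60, 100, lambda v: v + (100 - v) * 0.5):
    print(f"result={result:.1f} gap={gap:.1f}")
```

Each pass shrinks the gap to the target, which is the behavior PDCA is meant to institutionalize: the check step is only meaningful because the plan step fixed a measurable target in advance.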
Systems thinking is a methodology for understanding complex phenomena and problems as a whole by understanding the related elements and interactions, analyzing the behavior of the entire system and the causes of problems, and deriving remedial measures. Systems Thinking enables a deeper understanding of phenomena and problems by viewing things not as a single element or part, but as a “system” in which multiple elements and parts interact and function, and by grasping the whole picture.
Fermi estimation is a method for making rough estimates when precise calculations or detailed data are unavailable; it is named after the physicist Enrico Fermi. It is widely used as a means of quickly finding approximate answers to complex problems using logical thinking and appropriate assumptions. In this article, we discuss how Fermi estimation can be supported by artificial intelligence techniques.
“Kaisha Shikiho: Sangyo Chizu” is one of the special issues of Kaisha Shikiho, a corporate information magazine published by Toyo Keizai Inc. Published each March and September, and available at bookstores and online for a few thousand yen per copy, this map-style information magazine summarizes market trends and company performance evaluations for each industry, and also analyzes issues facing each industry and company. When considering DX, various concrete issues can be extracted by looking at companies in similar business categories and generalizing the issues of industries in specific domains.
This blog discusses various problem-solving methods in “Problem Solving Methods, Thinking and Design of Experiments,” quantification methods for problem solving in “KPI KGI OKRs,” and the use of abstraction steps in “Concrete and Abstract – Semantics and Description in Natural Language.” These problem-solving approaches are, at their core, techniques for identifying the essential problem.
Such an approach to problem solving was in fact compiled by a man called Sun Tzu in China about 2,500 years ago, during the Spring and Autumn period. In this article, I would like to explain the military treatise “Sun Tzu” based on “NHK 100 Pun de Meicho: Lao Tzu x Sun Tzu.”
Sun Tzu’s ideas can be considered the root of many modern approaches to problem solving. Although the subject matter is war, if we replace war with problem solving, we find maxims scattered throughout the book that match the thinking of today’s consultants: “the objective is important, not the means (war)”; “before undertaking problem solving (war), quantify and examine it from various perspectives”; “clearly define the goal of problem solving (war) and make clear when to stop”; and “do not just act for the time being, but be 80% ready at the pre-planning stage, and never fight without a plan or without a chance of victory.” The following is a concrete description of the contents of Sun Tzu’s book.
The world changes, and the categories and orders we apply to it should not be operated unthinkingly but should be revised in accordance with those changes. To do so, it is necessary to consider what order and category mean in the first place, and in considering their meaning it is important to consider purpose, as described in “Life as Information – Purpose and Meaning.” Order, which organizes a chaotic world, and freedom, which changes it, are both important elements.
In philosophy, we think about clarifying and sharing “what is good and why it is good” in dialogue with others. In order to come to a common understanding, it is important to ask questions. Specific steps include giving examples, ascertaining meaning, considering common elements, and thinking about the reasons why something is valuable.
Various methodologies have been examined for causality as causal inference (statistical causal inference) in data science. First, we will discuss the definitions of correlation and causality.
Correlation is defined as a linear relationship between two variables such that when the value of one variable is large, the value of the other variable is also large (or small).
A causal relationship is said to exist between X and Y when factor Y also changes when factor X changes. Factor X is then called the cause and factor Y is called its result. From here on, variables that indicate causes are called causal variables and variables that indicate results are called outcome variables (outcomes). Manipulating and changing a factor is called intervention or treatment. An intervention can be imagined as providing a treatment or showing an advertisement.
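Correlation and causation can come apart: the simulation below has a common cause Z driving both X and Y, so X and Y are strongly correlated even though neither causes the other. The data are simulated with assumed noise levels, and Pearson's correlation coefficient is computed from its textbook definition.

```python
import random
random.seed(0)

# A common cause Z drives both X and Y; X does not cause Y,
# yet X and Y are strongly correlated. All values are simulated.
n = 10_000
z = [random.gauss(0, 1) for _ in range(n)]
x = [zi + random.gauss(0, 0.5) for zi in z]   # X = Z + noise
y = [zi + random.gauss(0, 0.5) for zi in z]   # Y = Z + noise

def pearson(a, b):
    """Pearson correlation: covariance divided by the product of std devs."""
    ma, mb = sum(a) / len(a), sum(b) / len(b)
    cov = sum((ai - ma) * (bi - mb) for ai, bi in zip(a, b))
    sa = sum((ai - ma) ** 2 for ai in a) ** 0.5
    sb = sum((bi - mb) ** 2 for bi in b) ** 0.5
    return cov / (sa * sb)

print(round(pearson(x, y), 2))  # high correlation despite no causal link
```

Here Z is a confounding factor in the sense defined below: intervening on X would change nothing about Y, even though their correlation is near 0.8 in theory.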
There are many ways to examine causality while adjusting for the effects of confounding factors, but the most reliable way to test causality is through experimental studies. In the field of causal inference, an experimental study is one in which participants are randomly assigned to receive or not receive an intervention, and is called a randomized controlled trial (RCT). In contrast, studies in which the assignment is not random are called observational studies.
In real-life problems, it is unlikely that we know in advance what all the confounding factors are. It is better to assume that there are always unknown confounders. Furthermore, even if a confounding factor is known, it is not always measured, and for a variety of technical and cost reasons, it is not always possible to measure it. In other words, it is best to assume that there are confounders that have not been measured.
Random assignment in experimental studies (RCTs) allows causal inferences to be made without having to worry about any confounding factors, including unmeasured or unknown confounding factors as described above. This is a major advantage of RCTs over causal inference from observational studies. On the other hand, causal inference from observational studies generally requires that confounders be measured. Although some of the methods described below have the potential to address unmeasured or unknown confounders if the conditions are met, the two most basic methods, stratified analysis and regression model analysis, assume that confounders have been measured.
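The advantage of random assignment can be demonstrated by simulation. Below, an unmeasured confounder U raises both the chance of being treated (in the observational arm) and the outcome; the naive group-mean comparison is badly biased in the observational data but recovers the true effect under randomization. The true effect of +2.0 and all distributions are assumed for illustration.

```python
import random
random.seed(1)

n = 20_000
true_effect = 2.0
u = [random.gauss(0, 1) for _ in range(n)]       # unmeasured confounder

# Observational study: people with high U seek treatment more often
obs_treated = [random.random() < (0.8 if ui > 0 else 0.2) for ui in u]
# RCT: a coin flip decides treatment, independent of U
rct_treated = [random.random() < 0.5 for _ in range(n)]

def outcomes(treated):
    # outcome depends on treatment, the confounder, and noise
    return [true_effect * t + 3 * ui + random.gauss(0, 1)
            for t, ui in zip(treated, u)]

def naive_effect(treated, y):
    """Difference in mean outcome between treated and untreated groups."""
    t1 = [yi for ti, yi in zip(treated, y) if ti]
    t0 = [yi for ti, yi in zip(treated, y) if not ti]
    return sum(t1) / len(t1) - sum(t0) / len(t0)

print(round(naive_effect(obs_treated, outcomes(obs_treated)), 1))  # biased upward
print(round(naive_effect(rct_treated, outcomes(rct_treated)), 1))  # close to 2.0
```

Because the coin flip is independent of U, the treated and untreated groups in the RCT have the same distribution of the confounder, so the simple difference in means is an unbiased estimate of the causal effect, with no need to measure or even know about U.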
The perspective of the concrete and the abstract is very important when taking a semantic approach to natural language processing, structural machine learning, or explainable machine learning. The world around us is built on two opposing concepts: “concrete” and “abstract.” The word “concrete” typically appears when explaining something in an easy-to-understand way (“to put it concretely…”) or when asking for clarification of something we do not understand (“could you be a little more specific?”). Conversely, “abstract” is used in contexts such as “I don’t understand what that person is saying because it is so abstract.”
- Nine-screen method of TRIZ
- 101 Design Methods
- Business books you can understand just by looking at them