Abstract
Decision-making (DM) processes play an increasingly important role in business management. Optimizing operational, tactical and strategic decisions can give a firm an advantage over its competitors. DM is defined as a criteria selection method for choosing the alternative that best lets the firm reach its objective. The most important stakeholders involved in DM processes are described. In this chapter, decision trees are used to take all the possible alternatives into account, but this requires a large computational cost that depends on the ordering of the variables. To make this task as easy as possible, different ranking methods are explained. It is important to note that no single ordering method provides the best solution in all cases. For this reason several methods are presented: Top-Down-Left-Right, Depth First Search, Breadth First Search, the Level method and the AND method. Once all the alternatives have been taken into account, Binary Decision Diagrams provide an analytical expression for the interrelation between all the variables, which becomes an optimization function. This function is usually subject to constraints related to the availability of the firm's resources. Once the optimization has been carried out, it is easier to establish an optimal strategy for allocating the available resources. A case study is presented to facilitate the reader's understanding of the proposed method, and finally some conclusions are drawn from its application.
Keywords
- Binary Decision Diagrams (BDD)
- Methods Selection Criteria
- Breadth-first Search (BFS)
- Level Method
- Logical Decision Trees (LDT)
1 Introduction
Decision Making (DM) is defined as a criteria selection method, e.g. when a person wakes up in the morning and decides to bike instead of driving to work. Decisions can be made in the short, medium and long term. Every firm needs to discern which decisions are really important from a particular perspective. Correct decision-making in any firm can become a competitive advantage.
In a DM scenario there is an event that is desired, or not, and a path must be chosen among the different alternatives that lets the firm reach its objective [1]. Generally the optimal situation is required, i.e. the one that provides the best results from the profit, cost, safety, etc. point of view [2]. This is possible by employing a correct criterion for choosing the best scenario [3].
DM is defined by Harris as [4]:
The research of identifying and choosing alternatives based on the decision maker's weights/values and preferences. Making a decision entails that there are several alternatives to be considered; not only must the alternatives be identified, but the one that best fits the aims, restrictions, etc. must be chosen.
DM consists of the process of transforming data into actions [5]. Data collection therefore becomes a strategic task for DM, and the resulting actions help to keep improving the DM.
There are several alternatives to classify the decisions performed in a business. Figure 1 shows a classification of the decisions that can be made in a business environment.
Operational decisions aim to implement the strategic decisions. Wrong operational decisions have no far-reaching implications for the future and can be fixed easily.
Tactical decisions occur more frequently than operational decisions. They are controlled by procedures and routines. A wrong tactical decision may bring trouble to the business.
Strategic decisions are made over a long period, when there is not enough data, and they are critical for the future of the business; a wrong strategic decision can have fatal consequences.
According to their frequency, programmed decisions are those repeated often in a business, for which a procedure is developed and applied every time they occur. Non-programmed decisions, on the other hand, are those that emerge unexpectedly; they usually involve a high degree of difficulty and must be tackled by experts in the field.
Advances in technology and information help firms to develop their own software in order to reach good DM. Nevertheless, it is recommended to consider algorithms in order to find the optimal DM.
2 DM Process
The diagram of a DM process can be represented by decision trees. This provides a graphical representation of those situations that need to be improved. Although measures of importance can be applied to decision trees, the use of Binary Decision Diagrams (BDDs) involves a reduction in the computational cost of the quantitative resolution, among other improvements. In addition, the cut-sets obtained from the BDD will be the basis of the heuristic method developed for the analysis proposed in this chapter.
The following main scenarios can be distinguished according to the information available in the DM process:
-
DM under certainty: the problem is entirely known, i.e. all possible states of each basic cause (BC) are known and the consequences of each decision can be completely determined.
-
DM under risk: the information is partial and some of it is stochastic. This is the scenario considered in this chapter.
-
DM under uncertainty: information about the main problem (MP) and the BCs is incomplete and part of it is missing.
Figure 2 shows a flow chart with the main steps for a decision maker who considers rational and logical arguments to support their decision.
2.1 The Decision Maker
The decision maker is the person, system or organization that makes a decision. All decisions and assessments are influenced by the decision maker. Any decision maker should have certain skills, which can be summarized as experience, good judgment, creativity and quantitative skills. The first three are personal, while the last one is supported by existing methods and support systems for DM, which consider different scenarios and help the decision maker to choose the best decision. In this context, this chapter presents and describes a quantitative method to support the DM process.
2.2 Constraints and Requirements
In any DM process there are constraints or requirements to consider, e.g. existing resources, available budget, environmental precautions, social issues, legal provisions, etc. Generally the constraints are exogenous and cannot be incorporated directly into the mathematical or empirical models, i.e. an endogenization of the constraints must be carried out in order to consider them. This endogenization involves conceptualizing the constraints as goals of the decision maker, i.e. it is possible to reformulate a constraint as a main objective [6].
This chapter focuses on expected-utility DM under constraints, which can be considered a process that provides a solution for a decision maker with different objectives: to satisfy the constraints and rule out unfeasible solutions, and to maximize the utility function among the surviving options.
2.3 The Utility Function
For a quantitative stochastic DM case, the utility function provides a value that determines the quality of the solution being considered. It is formulated as an analytical expression derived from the LDT to BDD conversion. Thresholds may be established in order to determine the solutions that best suit the objectives of the decision maker.
2.4 Results
Once a decision is made, it is necessary to choose the best alternative or scenario, considering variables such as feasibility and the cost of implementing the decision. No DM process is completely reliable, due to the impossibility of taking into account the total range of events involved in the solution and of evaluating all consequences a posteriori. The evaluation of consequences is nonetheless essential for improving DM processes with data from forecasting studies. The results derived from a decision can affect the structure of the problem, or modify some features of the constraints and requirements. Feedback is necessary in order to determine the quality of the decision, because the decision maker needs to know whether the system responds as expected. Moreover, some decisions need to be made periodically, and feedback is needed in order to improve the quality of each new decision based on the previous one.
3 Logical Decision Trees
A DM process is carried out when a certain problem occurs, with the objective of discerning whether there is a real problem [7]. Logical Decision Trees (LDTs) describe graphically the roots and causes of a certain problem and their interrelation. The logical operators 'AND' and 'OR' are employed to connect the events considered [8].
Figure 3 shows an LDT composed of seven non-basic causes and nine basic causes. BCs are those causes that cannot be broken down into simpler causes. All these causes are linked by logical gates, in particular 'OR' and 'AND' gates. The LDT provides information about the critical states of the BCs and how the MP is usually generated. Figure 3 shows that BC7 is one of the most important causes, i.e. if BC7 occurs then the MP will occur.
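To make the gate logic concrete, the sketch below evaluates a small hypothetical LDT in Python. The tree, its gates and the names `evaluate` and `tree` are illustrative only, not the LDT of Fig. 3; gates are encoded as ("AND"/"OR", name, children) tuples and basic causes as ("BC", name):

```python
def evaluate(node, state):
    """Evaluate a gate/cause node given a dict of basic-cause truth values."""
    kind = node[0]
    if kind == "BC":                       # basic cause: look up its state
        return state[node[1]]
    children = (evaluate(child, state) for child in node[2])
    return all(children) if kind == "AND" else any(children)

# Hypothetical three-cause tree: MP = (BC1 OR BC2) AND BC3
tree = ("AND", "G1", [
    ("OR", "G2", [("BC", "BC1"), ("BC", "BC2")]),
    ("BC", "BC3"),
])

print(evaluate(tree, {"BC1": True, "BC2": False, "BC3": True}))   # True
print(evaluate(tree, {"BC1": False, "BC2": False, "BC3": True}))  # False
```

The same recursion extends to any number of nested gates, since each non-basic cause is simply a gate over its children.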
When the LDT is larger, e.g. composed of hundreds or thousands of BCs, the alternatives for a direct decision tree analysis also grow. A direct analysis can be carried out via an LDT to BDD conversion. An "if-then-else" conversion approach, described in reference [9], is used to carry out this conversion.
4 Binary Decision Diagrams
BDDs emerged from the constant search for an efficient way to simulate LDTs. They were introduced by Lee [10], and further popularized by Akers [11], Moret [12] and Bryant [13]. The BDD is used to analyze the LDT. It is a data structure that represents a Boolean function, providing a mathematical approach to the problem through Boolean algebra, like Karnaugh maps or truth tables, but less complex than the corresponding truth table.
A BDD is a directed acyclic graph representation of a Boolean function where equivalent Boolean sub-expressions are uniquely represented [14]. In a directed acyclic graph, for each vertex v there is no directed path that starts and finishes in v. The graph is composed of interconnected nodes, each with two outgoing branches, and each vertex is either terminal or non-terminal. The BDD is a graph-based data structure from which the occurrence probability of a certain problem in a DM process can be obtained. Each variable has two branches: the 0-branch corresponds to the cases where the variable is 0 and is graphically represented by a dashed line (Fig. 7); the 1-branch corresponds to the cases where the event occurs, i.e. the variable takes the value 1, and is represented by a solid line (Fig. 7).
This allows an analytical expression to be obtained that depends on the occurrence probability of every single basic cause and on the logical structure of the tree. Each path from the top of the diagram to a terminal vertex provides a certain state in which the MP occurs. These paths are named cut-sets (CSs).
4.1 Ranking of the Basic Causes
The size of the BDD, as well as the CPU runtime, depends strongly on the variable ordering. Different ranking methods can be used to reduce the number of cut-sets and, consequently, the CPU runtime. It must be emphasized that no single method provides the minimum size of the BDD in all cases [22]. The main methods are described in this section.
4.1.1 Top-Down-Left-Right (TDLR)
This method ranks the events by ordering them from the original fault tree (FT) structure in a top-down and then left-right manner [15]. At each level, the events are listed from left to right, adding the basic events encountered to the ordering list. If an event has already been considered higher up in the tree, it is ignored (Fig. 4).
The ranking for the example shown in Fig. 3 using the TDLR method is:
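A minimal sketch of the TDLR ordering, using a hypothetical tuple encoding of a tree (gates as ("AND"/"OR", name, children), basic causes as ("BC", name)); the small tree below is illustrative, not the LDT of Fig. 3:

```python
def tdlr_order(root):
    """TDLR ranking: visit the tree level by level from the top,
    left to right, recording each basic cause the first time it appears."""
    order, seen = [], set()
    level = [root]
    while level:
        next_level = []
        for node in level:
            if node[0] == "BC":
                if node[1] not in seen:    # duplicates seen higher up are ignored
                    seen.add(node[1])
                    order.append(node[1])
            else:
                next_level.extend(node[2])
        level = next_level
    return order

tree = ("OR", "Top", [
    ("AND", "G1", [("BC", "A"), ("BC", "B")]),
    ("BC", "C"),
])
print(tdlr_order(tree))   # ['C', 'A', 'B']
```

Note that C is ranked before A and B because it sits one level higher in the tree, even though it appears to the right.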
4.1.2 Depth First Search (DFS)
This approach traverses the tree from the top down, visiting each sub-tree from left to right (see Fig. 5). The procedure is a non-recursive implementation in which all freshly expanded nodes are added following a last-in, first-out (LIFO) process [16].
The ranking for the example presented in Fig. 3 is:
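A DFS ordering can be sketched with an explicit stack, again assuming the hypothetical tuple encoding of a tree (gates as ("AND"/"OR", name, children), basic causes as ("BC", name)) rather than the LDT of Fig. 3:

```python
def dfs_order(root):
    """DFS ranking: explore each subtree completely, left to right,
    expanding nodes with a last-in, first-out (LIFO) stack."""
    order, seen, stack = [], set(), [root]
    while stack:
        node = stack.pop()
        if node[0] == "BC":
            if node[1] not in seen:
                seen.add(node[1])
                order.append(node[1])
        else:
            stack.extend(reversed(node[2]))  # reversed so the leftmost child is popped first
    return order

tree = ("OR", "Top", [
    ("AND", "G1", [("BC", "A"), ("BC", "B")]),
    ("BC", "C"),
])
print(dfs_order(tree))   # ['A', 'B', 'C']
```

Unlike TDLR, the left subtree is exhausted before C is reached, so A and B come first.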
4.1.3 Breadth First Search (BFS)
This algorithm orders the basic events by expanding nodes from the starting point following a first-in, first-out (FIFO) procedure (Fig. 6). The events not yet considered are added to a queue named the "open" list, which becomes the "closed" list once all the events have been studied [17].
The ranking for the example of Fig. 3 is:
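The FIFO expansion can be sketched with a queue, again over the hypothetical tuple encoding (gates as ("AND"/"OR", name, children), basic causes as ("BC", name)); the tree is illustrative, not the LDT of Fig. 3:

```python
from collections import deque

def bfs_order(root):
    """BFS ranking: expand nodes first-in, first-out; the pending queue
    plays the role of the 'open' list, the visited set that of 'closed'."""
    order, seen = [], set()
    open_list = deque([root])
    while open_list:
        node = open_list.popleft()
        if node[0] == "BC":
            if node[1] not in seen:
                seen.add(node[1])
                order.append(node[1])
        else:
            open_list.extend(node[2])
    return order

tree = ("OR", "Top", [
    ("AND", "G1", [("BC", "A"), ("BC", "B")]),
    ("BC", "C"),
])
print(bfs_order(tree))   # ['C', 'A', 'B']
```

For this tiny tree the result coincides with the TDLR ordering; on larger trees with repeated events the two methods can differ.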
4.1.4 Level Method
The level of an event is understood as the number of gates above it in the tree, up to the top event. The Level method ranks the events according to their level. If two or more events have the same level, the event that appears earlier in the tree has higher priority [15]. Table 1 shows the levels of the events of the LDT shown in Fig. 3.
Therefore, according to Table 1 and the Level method, the ranking obtained is:
It can be observed that in this case the ranking proposed by this method is the same as that of the TDLR method; therefore, the CSs obtained will also be the same.
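A sketch of the Level method over the hypothetical tuple encoding used above (gates as ("AND"/"OR", name, children), basic causes as ("BC", name)); the tie-break here uses first appearance in a top-down walk, and the tree is illustrative, not the LDT of Fig. 3:

```python
def level_rank(root):
    """Level-method ranking: an event's level is the number of gates above
    it; lower levels rank first, ties broken by first appearance."""
    info = {}          # basic cause -> (level, first-appearance index)
    count = 0

    def walk(node, level):
        nonlocal count
        if node[0] == "BC":
            name = node[1]
            if name not in info:
                info[name] = (level, count)
                count += 1
            elif level < info[name][0]:    # keep the highest (smallest-level) occurrence
                info[name] = (level, info[name][1])
        else:
            for child in node[2]:
                walk(child, level + 1)

    walk(root, 0)
    return sorted(info, key=lambda name: info[name])

tree = ("OR", "Top", [
    ("AND", "G1", [("BC", "A"), ("BC", "B")]),
    ("BC", "C"),
])
print(level_rank(tree))   # ['C', 'A', 'B']
```

Here C has level 1 (one gate above it) while A and B have level 2, reproducing the TDLR order, as observed for the example of Fig. 3.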
4.1.5 AND Method
Xie et al. [18] suggest, through the AND criterion, that the importance of a basic event depends on the 'AND' gates located between the event and the top event, because in fault tree analysis (FTA) 'AND' gates imply that there are redundancies in the system. Consequently, basic events under an 'AND' gate can be considered less important, since other basic events must also occur for the intermediate event to occur [18]. Furthermore:
-
Basic events with the highest number of 'AND' gates are ranked last.
-
In case of duplicated basic events, the occurrence with fewer 'AND' gates has preference.
-
Basic events with the same number of 'AND' gates are ranked following the TDLR approach.
The ranking for the example shown in Fig. 3 using the AND method is:
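The three rules above can be sketched as follows, over the same hypothetical tuple encoding (gates as ("AND"/"OR", name, children), basic causes as ("BC", name)); for simplicity the tie-break uses first appearance in a top-down walk rather than a full TDLR pass, and the tree is illustrative, not the LDT of Fig. 3:

```python
def and_rank(root):
    """AND-criterion ranking: count the 'AND' gates between each basic
    cause and the top event; fewer AND gates ranks first."""
    info = {}          # basic cause -> (number of AND gates above, appearance index)
    count = 0

    def walk(node, n_and):
        nonlocal count
        if node[0] == "BC":
            name = node[1]
            if name not in info:
                info[name] = (n_and, count)
                count += 1
            elif n_and < info[name][0]:    # duplicated events: fewer ANDs has preference
                info[name] = (n_and, info[name][1])
        else:
            extra = 1 if node[0] == "AND" else 0
            for child in node[2]:
                walk(child, n_and + extra)

    walk(root, 0)
    return sorted(info, key=lambda name: info[name])

tree = ("OR", "Top", [
    ("AND", "G1", [("BC", "A"), ("BC", "B")]),
    ("BC", "C"),
])
print(and_rank(tree))   # ['C', 'A', 'B']
```

C sits under no 'AND' gate, so it is ranked first; A and B each lie under one 'AND' gate and are ranked afterwards.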
4.2 BDD Conversions
The conversion of small BDDs for calculating the MP occurrence probability can be done manually, but for larger LDTs this is almost impossible.
The ite (If-Then-Else) conditional expression is employed in this work as the cornerstone of the BDD, based on the approach presented in [19]. Figure 7 shows an example of an ite structure in a BDD.
It can be read as: "If variable A occurs, Then f1, Else f2" [20]. As explained above, the solid line always corresponds to the ones and the dashed line to the zeros.
Applying Shannon's theorem to Fig. 7, the following expression is obtained:
\( f = ite(A, f_1, f_2) = A \cdot f_1 + \overline{A} \cdot f_2 \)
where \( f_1 \) and \( f_2 \) are the Boolean functions reached through the 1-branch and the 0-branch of A, respectively.
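The Shannon decomposition also gives a direct way to evaluate the MP occurrence probability on a BDD: each non-terminal node contributes P(A) times the probability of its 1-branch plus 1 − P(A) times that of its 0-branch. A sketch with a hypothetical BDD (not that of Fig. 8), encoding nodes as (variable, 1-branch, 0-branch) tuples and terminals as 1/0:

```python
def bdd_prob(node, p):
    """Probability of a BDD node ite(A, f1, f2):
    P = P(A)*P(f1) + (1 - P(A))*P(f2)   (Shannon decomposition)."""
    if node in (0, 1):                  # terminal vertices
        return float(node)
    var, branch1, branch0 = node
    return p[var] * bdd_prob(branch1, p) + (1.0 - p[var]) * bdd_prob(branch0, p)

# Hypothetical BDD for MP = BC1 OR (BC2 AND BC3)
bdd = ("BC1", 1, ("BC2", ("BC3", 1, 0), 0))
print(bdd_prob(bdd, {"BC1": 0.1, "BC2": 0.2, "BC3": 0.5}))   # approx 0.19
```

The recursion visits each node once per path, which is why the size of the BDD (and hence the variable ordering) drives the computational cost.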
Table 2 shows the different CSs obtained using the abovementioned ranking methods. A comparison between the CSs is made in order to analyze the efficiency of the methods in ordering the basic causes shown in Fig. 3.
There is no significant difference between these methods because the sizes of all the CS lists are similar (Table 2). The main reason is that the LDT being analyzed does not have a large number of events. Table 2 shows that the method generating the lowest number of CSs is the AND method; therefore, it is chosen in this work.
Figure 8 shows the BDD obtained from the LDT of Fig. 3 using the AND method.
4.3 Analytical Expression
A probability of occurrence must be assigned to each BC. \( P(BC_i) \) is the probability of occurrence of the ith BC, and \( P\left(\overline{BC_i}\right) \) is the probability of non-occurrence of the ith BC. Therefore: \( P(BC_i) + P\left(\overline{BC_i}\right) = 1 \).
The probability of occurrence of the jth CS, \( P(CS_j) \), can be calculated as the product of the \( P(BC_i) \) and \( P\left(\overline{BC_i}\right) \) that compose the CS. The probability of occurrence of the MP, \( Q_{MP} \), is given by: \( Q_{MP} = \sum_{j=1}^{n} P(CS_j) \)
where n is the total number of CSs.
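Since the paths of a BDD are mutually exclusive, the sum above can be computed directly from the cut-sets. A sketch, assuming each CS is given as a dict mapping basic causes to their state (0 or 1) along the path; the data are hypothetical, not those of the chapter's case study:

```python
def cs_prob(cs, p):
    """Probability of one cut-set: product of P(BC_i) for causes at 1
    and 1 - P(BC_i) for causes at 0 along the path."""
    prob = 1.0
    for bc, value in cs.items():
        prob *= p[bc] if value == 1 else (1.0 - p[bc])
    return prob

def q_mp(cut_sets, p):
    """Q_MP: sum of the (disjoint) cut-set probabilities."""
    return sum(cs_prob(cs, p) for cs in cut_sets)

# Disjoint paths of a hypothetical BDD for MP = BC1 OR (BC2 AND BC3)
paths = [{"BC1": 1}, {"BC1": 0, "BC2": 1, "BC3": 1}]
probs = {"BC1": 0.1, "BC2": 0.2, "BC3": 0.5}
print(q_mp(paths, probs))   # approx 0.19
```

Because the BDD paths are disjoint, the probabilities can simply be added; cut-sets taken directly from an LDT would require an inclusion-exclusion correction instead.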
5 Optimization Approach in DM Process
Once the probability of occurrence of the MP has been obtained, the new objective is to minimize it [21]. It is assumed that the LDT is fixed, and therefore the reduction of \( Q_{MP} \) must be achieved through the different BCs. Given a BC, the objective is to determine the investment in it needed to reduce its probability of occurrence, considering the probabilities of the remaining BCs and the total investment, the objective function being the minimization of the probability of occurrence of the top event, \( Q_{MP} \). A new variable that considers this reduction is defined as:
\( Imp(BC) = [Imp(BC_1), Imp(BC_2) \ldots Imp(BC_i) \ldots Imp(BC_n)] \)
The ith component of Imp(BC) provides the reduction of the probability of occurrence when some resources are assigned to the ith BC. In addition, a probability vector is defined as:
\( P(BC) = [P(BC_1), P(BC_2) \ldots P(BC_i) \ldots P(BC_n)] \)
The ith component of P(BC) provides the probability of occurrence of the ith BC. Once the BCs have been improved, the new probabilities are calculated as the difference between each probability of occurrence and its Imp: \( P^{*}(BC_i) = P(BC_i) - Imp(BC_i) \)
The BDD evaluated using P(BC) provides the value of \( Q_{MP} \). If it is evaluated using \( P^{*}(BC) \), the result obtained is termed \( Q^{*}_{MP} \). It is required that \( Q_{MP} \geq Q^{*}_{MP} \); otherwise the optimization procedure is producing wrong outcomes.
The analytical expression provided by the BDD becomes an optimization function when it is evaluated employing \( P^{*}(BC) \). The optimization function is given by \( Q_{MP}(Imp) \).
BCs are not necessarily fully correctable, but they are almost always improvable; therefore Imp(BC) will range between 0 and a certain threshold. The first constraint is defined as:
\( 0 \leq Imp(BC_i) \leq a_i \), where \( a_i \) indicates the maximum improvement that can be achieved in the ith BC. The extreme values of \( a_i \) are:
\( a_i = P(BC_i) \) means that the ith BC can be totally corrected, since its probability of occurrence can be reduced to zero (in this case \( BC_i \) no longer contributes to the MP occurrence). If \( a_i = 0 \), then improvements in the ith BC are not possible.
An improvement cost (IC) is defined for each BC in order to adapt the quantitative analysis to the nature of the BCs, where a high IC for a BC means that a large amount of resources must be invested in order to reduce its probability of occurrence.
\( IC(BC) = [IC(BC_1), IC(BC_2) \ldots IC(BC_i) \ldots IC(BC_n)] \), where \( IC(BC_i) \) indicates the amount of resources that must be invested in \( BC_i \) to reduce its probability of occurrence from 1 to 0.
A new variable, Bg, is defined as the total amount of resources available at the time of the investment operation. Given the definition of IC, the cost of the improvements is taken as proportional to the probability reduction, so the budget constraint is:
\( \sum_{i=1}^{n} Imp(BC_i) \cdot IC(BC_i) \leq Bg \)
The optimization problem is then defined as:
\( \min_{Imp} Q_{MP}(Imp) \) subject to \( 0 \leq Imp(BC_i) \leq a_i \) and \( \sum_{i=1}^{n} Imp(BC_i) \cdot IC(BC_i) \leq Bg \)
This is a Non-Linear Programming Problem (NLPP) and is NP-hard. The necessary conditions of optimality are given by the Karush-Kuhn-Tucker (KKT) conditions [4].
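The chapter solves the NLPP through the KKT conditions; purely as an illustration of the structure of the problem, the sketch below spends the budget greedily, in small steps, on whichever BC offers the best marginal reduction of Q_MP per monetary unit. This is a heuristic, not the KKT-based method, and all the data (causes, probabilities, costs) are hypothetical:

```python
def greedy_allocate(p, a, ic, budget, q_fn, step=0.01):
    """Heuristic for min Q_MP(Imp) s.t. 0 <= Imp_i <= a_i and
    sum_i Imp_i * IC_i <= Bg: repeatedly invest `step` of probability
    reduction where the marginal gain per unit cost is highest."""
    imp = {bc: 0.0 for bc in p}
    spent = 0.0
    while True:
        current = {bc: p[bc] - imp[bc] for bc in p}
        base = q_fn(current)
        best, best_d, best_cost, best_ratio = None, 0.0, 0.0, 0.0
        for bc in p:
            d = min(step, a[bc] - imp[bc])           # improvement-limit constraint
            cost = d * ic[bc]
            if d <= 1e-12 or spent + cost > budget:  # budget constraint
                continue
            trial = dict(current)
            trial[bc] = current[bc] - d
            ratio = (base - q_fn(trial)) / cost
            if ratio > best_ratio:
                best, best_d, best_cost, best_ratio = bc, d, cost, ratio
        if best is None:                             # nothing affordable or beneficial
            return imp
        imp[best] += best_d
        spent += best_cost

# Hypothetical problem: MP = BC1 AND BC2, so Q_MP = P(BC1)*P(BC2)
q_fn = lambda pr: pr["BC1"] * pr["BC2"]
p  = {"BC1": 0.5, "BC2": 0.5}
a  = dict(p)                      # both BCs fully correctable
ic = {"BC1": 1.0, "BC2": 10.0}    # BC2 is ten times costlier to improve
imp = greedy_allocate(p, a, ic, budget=0.5, q_fn=q_fn)
```

With these numbers the whole budget flows to BC1, the cheaper cause; such a greedy allocation can miss the true optimum of a non-linear objective, which is exactly why the chapter resorts to the KKT conditions.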
6 Case Study
A case study is presented in this section. The LDT shown in Fig. 3 corresponds to a real case study of a firm whose identity is confidential. Table 3 details the initial conditions.
The probability of occurrence of the MP is 0.4825, calculated with the analytical expression obtained from the BDD in Fig. 8 and the probabilities of occurrence shown in Table 3. The maximum investment (MI) would be:
Figure 9 shows the maximum investment for each BC.
If this maximum budget were available, then the minimum probabilities of occurrence (\( P_{min} \)) of the BCs would be:
Consequently, the minimum \( Q_{MP} \) is:
Therefore, when the available budget is less than 560.5 monetary units (μm), it is necessary to choose which BCs are best improved in order to minimize QMP.
The budget considered in the following example is 350 μm. For this purpose, the objective function, subject to the constraints defined by the budget and the improvement limits, will be:
Figure 10 shows the optimal investment allocation subject to a budget of 350 μm in order to minimize QMP.
There are two BCs (Fig. 10) that are not improved because the budget is limited, and one BC where the maximum investment has not been completed due to the lack of budget. The missing amounts to invest are:
Adding all these quantities gives a value of 250, which is precisely the difference between the maximum budget and the available budget. Figure 11 shows the behavior of the QMP reduction according to the available budget. QMP presents a non-linear behavior.
The slope is steeper at the beginning (Fig. 11), i.e. the investments are more useful up to a certain budget. Figure 12 shows the improvement of each investment compared to the previous one. The investment considered rises in steps of 50 monetary units.
The first 250 μm (Fig. 12) are more useful than the rest of the investment. This is useful information when the available budget is limited.
7 Conclusions
Decision Making (DM) is a criteria selection method for choosing a good alternative. The diagram of a DM process can be represented by Logical Decision Trees (LDTs), giving a graphical representation of those situations that need to be improved. An LDT describes graphically the roots of a certain problem and their interrelations. Employing Binary Decision Diagrams (BDDs) reduces the computational cost of the quantitative analysis.
The size of a BDD, and therefore the computational cost of the quantitative analysis, depends on the variable ordering. The TDLR, DFS, BFS, Level and AND methods have been employed in order to find the optimal variable ordering.
An NP-hard Non-Linear Programming Problem (NLPP) is considered in a real case study. The necessary conditions of optimality are given by the Karush-Kuhn-Tucker (KKT) conditions. The optimal allocation has been found for the case where resources are limited.
References
Umm-e-Habiba, Asghar S (2009) A survey on multi-criteria decision making approaches. International conference on emerging technologies, Pakistan
Ekárt A, Németh SZ (2005) Stability analysis of tree structured decision functions. Eur J Oper Res 160:676–695
Baker D, Donald B, Hunter R, et al (2001) Guidebook to decision-making methods. Department of energy WSRC-IM-2002-00002. December 2001
Harris R (2013) Introduction to decision making. Virtualsalt. Retrieved from http://www.virtualsalt.com/crebook5.htm. 9 June 2012, 6 September 2013
Forrester JW (1993) System Dynamic and the Lessons of 35 years. A systems-based approach to policymaking. pp 199–240
Goertz G (2004) Constraints, compromises and decision making. Department of Political Science. University of Arizona. J Confl Resol 48(1):14–37
Huber GP (2007) Toma de decisiones en la gerencia, 2nd edn. Trillas, México
Lopez D, Van Slyke WJ (1977) Logic tree analysis for decision making. Omega, Int J Manag Sci 5(5):614–617
Pliego A (2012) Estudio cuantitativo y cualitativo de fallos en sistemas complejos. July 2012. Ciudad Real, Spain
Lee CY (1959) Representation of switching circuits by binary-decision programs. Bell Syst Tech J 38:985–999
Akers SB (1978) Binary decision diagrams. IEEE Trans Comput C-27(6):509–516, June 1978
Moret BME (1982) Decision trees and diagrams. Comput Surv 14:413–416
Bryant RE (1986) Graph-based algorithms for Boolean function manipulation. IEEE Trans Comput C-35(8):677–691
Fujita M, Fujisawa H, Kawato N (1988) Evaluation and improvements of Boolean comparison method based on binary decision diagrams. Fujitsu Laboratories Ltd
Malik S, Wang AR, Brayton RK, Vincentelli AS (1988) Logic verification using binary decision diagrams in logic synthesis environment. In: Proceedings of the IEEE international conference on computer aided design
Cormen TH, Leiserson CE, Rivest RL, Stein C (2001) Introduction to algorithms, 2nd edn. MIT Press and McGraw-Hill. Section 22.3: Depth-first search, pp 540–549. ISBN 0-262-03293-7
Jensen R, Veloso MM (2000) OBDD-based universal planning for synchronized agents in non-deterministic domains. J Artif Intel Res 13:189–226
Xie M, Tan KC, Goh KH, Huang XR (2000) Optimum prioritisation and resource allocation based on fault tree analysis. Int J Qual Reliab Manag 17(2):189–199
Artigao E (2009) Análisis de árboles de fallos mediante diagramas de decisión binarios. November 2009
Brace KS, Rudell RL, Bryant RE (1990) Efficient implementation of a BDD package. 27th ACM/IEEE design automation conference
Garcia F, Pliego A, Lorente J, Trapero J (2014) A new ranking approach for decision making in maintenance management. Proceedings of the seventh international conference on management science and engineering management. Lecture notes in electrical engineering, vol 241. pp. 27–38
Bartlett LM (2003) Progression of the binary decision diagram conversion methods. Proceedings of the 21st international system safety conference; August 4–8, 2003. Ottawa, Westin Hotel, pp 116–125
Acknowledgements
The work reported herewith has been financially supported by Spanish Ministerio de Economía y Competitividad, under Research Grant DPI2012-31579.
© 2015 Springer International Publishing Switzerland
Marugán, A.P., Márquez, F.P.G. (2015). Decision Making Approach for Optimal Business Investments. In: García Márquez, F., Lev, B. (eds) Advanced Business Analytics. Springer, Cham. https://doi.org/10.1007/978-3-319-11415-6_1