Welcome to 2015 at CBS - Welcome to a year with challenges
By Jan Molin, Dean of education
Translated by External Affairs, CBS
Having observed the challenging context that CBS is co-creator of, I have picked a general theme for the contributions I will be writing this year (as part of the Senior Management's 'blog' series). I will focus on describing a number of apparent paradoxes which can be observed in our common working life. Apparent, because paradoxes are built on assumptions that irreconcilable phenomena and contrasts are 'of the same kind' and on 'the same unique continuum'.
Take the 'quality-resource' debate in which it is often assumed that there is a direct relation between the allocation of resources and an observable effect on quality. Here the political demand for better quality with fewer resources has been considered a paradox to such a degree that it has been impossible to discuss in which areas it might in fact be appropriate to reduce resource consumption and view this as an opportunity to change current routines in order to improve quality. Of course, this does not mean that one can simultaneously improve quality and reduce resources in every place and every situation. Rather, it means that viewing the political demand as a paradox blocks any analysis of when and under which conditions it is possible to generate intelligent answers.
When times are hard and resources are scarce, it makes good sense to strengthen dialogue and work to avoid rash conclusions. Paradoxes are a form of everyday rationality. They are illustrations of a local common sense - and of course common sense is just a step away from social control. If we accept the idea that the coexistence of contrasts and irreconcilable entities is the rule rather than the exception, we have the opportunity to resolve many paradoxes. When paradoxes are resolved, energy is set free and the room for action is increased because the frozen observation of apparent contrasts seems to paralyse and block initiative (damned if you do – damned if you don’t).
At CBS we need, more than ever, to strengthen and advance our particular capacity to break with common sense. The years leading up to 2015 have held plenty of challenges and have demonstrated surprising examples of original and unorthodox answers. Here, CBS has a 'muscle' that should be trained and strengthened: It is about the will to use structural limitations as opportunities to invent intelligent solutions - and it is about the inclination to take a risk: because we all know that no matter how frozen our situation appears from our analyses, we can always choose to do something different.
I am not posing as a naive optimist; rather, I am trying to describe the necessity of keeping the constructive and resourceful basis that has brought CBS this far. As Stanford professor Jeffrey Pfeffer wrote some years ago: ”doing something requires doing something”
Paradox #1: control is a precondition for trust
I will start with a short detour which will hopefully create an illustrative background for the following line of reasoning:
Just over fifty years ago, Stanley Milgram conducted a landmark psychological experiment at Yale University. He demonstrated that ordinary people were willing to subject another person to electric shocks of up to 450 volts if a man in a white coat told them to do so.
These experiments were taken to reflect the theory that:
Humans are prepared to commit inhuman acts
if an authority 'orders' them to do so.
At about the same time, more than twenty years of studies were concluded: the famous 'Hawthorne experiments', in which a randomly assembled group of female factory workers allegedly increased productivity in reaction not only to positive but also to negative changes in their working conditions.
These experiments were taken to reflect the theory that:
Humans are prepared to commit irrational acts
if the management can create the necessary motivation.
These two historic experiments show interesting parallels to the ongoing debate on the main and subsidiary characters of the financial crisis. For the past few years we have witnessed a grand-scale social psychology experiment: people in the financial sector appeared ready to suspend their better judgement if the prospective gain was big enough - and if their decisions and actions were not sufficiently checked.
The financial crisis as a social psychology experiment has been taken to reflect the theory that:
Humans are prepared to act from greed and narrow self-interest
if there is an absence of effective control systems.
Just as Milgram's experiment has been taken to reflect the authoritarian personality,
just as the Hawthorne experiments have been taken to reflect the effect of extrinsic motivation,
so many have advocated that the financial crisis reflects greed as a basic human trait.
All three interpretations express an assumption that
People are generally weak, controllable and victims of their own or other people's questionable motives. The underlying view of human nature is essentially depressing and sceptical; it conveys mistrust in people's common judgement.
Based on a more optimistic and altruistic view of human nature, the three examples can be taken to reflect a very different basic assumption: that people base their actions on trust.
When Milgram's test subjects are ready to impose electrical shocks
They do so because they trust the expert
When the factory workers increase their productivity even when working conditions deteriorate
They do so because they have established trust in each other and the 'subculture' they have created amongst themselves
When managers and employees in the financial sector have been tempted to
grant loans and credits beyond common sense
They do so because they trust the system and the people they deal with
In all three cases the data holds plenty of room for trust to play a possible part...
But strangely, the predominant reading is based on mistrust of the individual and a biased focus on people's weaknesses and darker side.
When mistrust sets the agenda it is often a signal that authoritarian forces are afraid to lose control. When control takes over it prepares the ground for casting suspicion:
Let us take a specific example:
An unpleasant case arises at a random Danish institution when a researcher uses research funds to pay for private dinners and travel on a relatively large scale.
The case comes to the public's attention when an administrative employee, feeling so unfairly treated by the researcher in question, passes copies of records documenting the irresponsible practice with research funds to a reporter.
The institutional logic immediately presents itself.
A clear and unambiguous policy for the appropriate administration and use of research funds must be prepared.
Standards, systems and controls must be put in place to ensure that everyone complies with this new policy.
The management demonstrates efficiency and pre-empts potential external criticism by tightening internal controls. The technocracy thereby demonstrates its unique capacity for creating external trust and legitimacy through systems that create internal suspicion and mistrust.
When we want to manage and control in connection with these big 'cases' the fear of the individual extreme deviant sadly often becomes a legitimisation of a massive suspicion cast on a lot of honest and decent people.
So also in the example above: one year on we see that, a couple of cases aside, there have been no other examples of actual misuse. That is all to the good. The problem, however, is that the systems, procedures and controls introduced are here to stay (despite the fact that this was one, or very few, isolated cases in the entire country).
In his book 'The Black Swan', Nassim Taleb writes about great unforeseen disasters with enormous impact. His reasoning concerns how great unforeseen disasters seem to be followed by processes in which people try to find relatively simple explanations for what caused the crisis - and, not least, to create knowledge of and suggestions for initiatives that will avert similar disasters in future and which, it is implied, would have made the disaster foreseeable (if only... it would not...)
Since the disaster thus should have been foreseen, there can only be two reasons why it went so wrong:
Either the systems are not accurate enough - or it is a case of 'human error'.
In both cases the solution is simple: more control.
Management is always built on an underlying (usually unspoken) view of human nature, and the above examples express my attempt to illustrate how such a view co-creates our ideas of why phenomena emerge and how we should respond to them. If we deal with our colleagues based on the assumption that they are unstable and must necessarily be controlled, that obviously opens a different toolbox than if we view them as trustworthy and decent people. When mistrust is organised, the need for control arises. Control is the expressed suspicion cast on the majority as protection against the individual deviant.
Thus management projects its own weaknesses onto the employees (and vice versa).
He who does not feel in complete control of himself experiences an increasing need to control others - or, to put it differently:
I will lose respect and authority if I am seen not to have a handle on my responsibilities;
therefore I need to ensure that no mistakes are made due to my lack of control over my employees/colleagues.
In a small survey on trust in the workplace, conducted at MMD (Master of Management Development) in the autumn of 2014, a clear majority of managers replied that they, to a large extent, trusted their colleagues and employees. They also replied that they did not believe their colleagues and employees trusted others to the same extent. The answers can thus be interpreted as follows: the responding managers trust others, but they mistrust the trust of those others. The paradox is obvious.
Therefore companies and institutions introduce control systems in the name of trust (to ensure that we can trust the system in future), although we know that the more control we introduce, the greater the probability that mistakes will be made. The more entities that control the same conditions, the more superficial the quality of each individual entity's control - because everyone relies on the certainty that there are many others who can help catch the error. When everyone thus trusts that someone else will catch the error, it is inevitable that isolated serious errors will occur, which, in accordance with the prevailing logic (mistrust), leads to the introduction of further procedures and controls.
Thus trust in the most extraordinary way leads to mistrust and to control.
If we 'resolve' the paradox and free ourselves from the idea of necessary control, we find resources to invest in activities that will strengthen CBS' cohesion and advancement. It might even help strengthen our internal trust in each other and in the future. Resolving a paradox is about taking responsibility and accepting the risk that something unforeseen could happen. If we work with the paradoxes of trust, we must abandon the illusion of control and the dream of 'order'. We create trust by showing trust. Control creates nothing.
That poses the natural question: who is the 'we' mentioned above?
It is us. You and I and the two of us. It is colleagues, employees and managers in all our different capacities and positions. Creating community and developing a stronger trust-based culture cannot be orchestrated by a narrow management group or an active minority. It takes movement.
And the word move has at least three connotations:
Move can be 'the act of changing position'...
Moving can be 'to stir emotions' - and
Movement can be 'an organisation of people'.
Happy New Year!