TAJnet - TA journal
ISSN: 1524-0029
Transactional Analysis Journal

Vol. 2, 1999: Science and TA
Jan 24, 1999, by Chris Boyd


ITAA has consistently called for scientific research to confirm TA theories. We know that science persistently redefines itself by experimenting, repeating experiments, and reverifying results, or by showing results to be at least partially incorrect under certain conditions. Similarly, Script Analysis explores and tests the assumptions upon which a life has been built, showing some to be valid and some to be at least partly inaccurate under current conditions. Perhaps in this way TA has a very science-like feel to it. Perhaps because TA parallels the scientific process, we assume that it is amenable to the scientific method. Is this assumption valid?

Scientific research seeks to supply scientific evidence that a particular hypothesis is valid. In the social sciences this usually means research using statistical methods. From a well-defined population, researchers choose random samples. Statistics allows results derived from the random sample to be generalized to the population. The trade-off of sampling rather than measuring the entire population is that the conclusions carry a degree of certainty of less than 100%.

Some of the questions I may ask myself to test the validity of such research are:

  1. Is statistical induction the appropriate mathematical tool for this type of data?
  2. Is the appropriate statistical model being used?
  3. Is the sample truly random? If not, the results cannot be generalized to the population.
  4. Is the sample large enough? As the sample size decreases, so does the power of the hypothesis test.
  5. Are the results overgeneralized to a larger population than was sampled?
  6. If the study demonstrates a positive correlation between variables, have the researchers unjustifiably implied that one causes the other?
  7. Is the study repeatable, with the same methodology yielding the same results?
  8. Was the study double-blind, to eliminate the placebo effect?
  9. Does the research suggest anything about individuals? One of the limitations of statistics is that the numbers are taken from groups and can only be applied to groups; they say nothing about individuals.
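The sampling trade-off behind several of these questions can be made concrete in a few lines of Python. This is a minimal sketch, not from the article: the population of test scores, the sample size, and the seed are all invented for illustration. Drawing a truly random sample (question 3) and attaching an approximate 95% confidence interval shows how a conclusion about the population carries less than 100% certainty.

```python
import random
import statistics

random.seed(42)  # fixed seed so the sketch is reproducible

# Hypothetical population: a score for each of 10,000 individuals.
population = [random.gauss(100, 15) for _ in range(10_000)]

# A truly random sample: every member has an equal chance of
# selection, which is what licenses generalizing to the population.
sample = random.sample(population, 100)

mean = statistics.mean(sample)
sem = statistics.stdev(sample) / len(sample) ** 0.5  # standard error

# Approximate 95% confidence interval: the price of sampling rather
# than measuring everyone is a conclusion with less than 100% certainty.
low, high = mean - 1.96 * sem, mean + 1.96 * sem
print(f"sample mean {mean:.1f}, 95% CI ({low:.1f}, {high:.1f})")
```

Note that the interval speaks about the population mean, a group-level quantity; as question 9 observes, it says nothing about any individual.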

According to Berne, there are two ways of knowing and creating. Along with the Adult reality-testing functions, which might include statistical induction, he held intuition in high esteem:

"One might even go so far as to agree that in everyday life we learn more, and more truly, through intuition than we do through verbalized observations and logic. We are tempted to be proud of verbalizations, but it is possible that in many of our most important judgments the small and fragile voice of intuition is a more reliable guide.... Verbal processes are additive, while intuitive processes are integrative--It appears that the most important judgments which human beings make concerning each other are the products of preverbal processes--cognition without insight-- which function almost automatically below the level of consciousness...that there is a time for scientific method and a time for intuition--the one brings with it more certainty, the other offers more possibilities; the two together are the only basis for creative thinking."(Berne, 1977)

TA theory is largely developed from this second way of knowing, that is, from Little Professor insights into the human condition. Lacking mathematical rigor, these insights can be called heuristics or rules of thumb. These heuristics became the foundation of an organizational Script much as Little Professor judgments become the foundation of a human Script. Script Analysis suggests we must investigate our decisions made with the Little Professor and use current Adult information to update those assumptions which influence our Script.

In doing a TA Script Analysis of our organization we will meet some of the same types of resistance, in that we have based our collective life on a certain set of assumptions. Those assumptions have allowed us to survive even though they may be incorrect. Many therapists base their livelihoods on these assumptions, and questioning them under the scrutiny of rigorous scientific methodology may show them to be invalid to some degree. Perhaps we are too financially invested in the theory to keep an open mind to the results of unbiased research. Yet perhaps these fears have no foundation, for we know that good research limits the scope of study to narrowly defined parameters, so that it is only possible to confirm or deny well-defined parts of TA, not TA in its entirety. This limitation of scope will keep us from proving or disproving TA theory in general.

We must question our organizational Parental injunctions if we are to be internally consistent with our theory. Many of the arguments for TA are currently ad hominem (literally, "to the man") arguments; that is, a particular theory is held to be correct because an individual therapist or group of therapists held in high regard said so, much as Script messages may be accepted on the authority of parents. Currently, we do not have well-defined experiments yielding results that can be replicated.

If we choose the scientific approach, all is fair game, even some of the basic foundational theories. For example, we may ask why our theory is based on the research of Wilder Penfield (Berne, 1961). Some have questioned the generalizability of his results because his test sample consisted only of epileptic patients. If we choose to ignore these objections to his theory, why?

Assuming that it is possible to base a theory on brain research, how did we choose Penfield above others, and why? Are we selectively sampling neurological researchers in order to bias our results in our favor? Do we ignore research done after Berne's lifetime? Is the only argument in favor of Penfield ad hominem, because Berne said so? If so, why? If not, why not?

How did we come to exclude other theories of brain function, especially post-Bernean theories? Michael Gazzaniga's research with split-brain patients seems to demonstrate the existence of an unconscious and might be seen to support Freudian theory (Gazzaniga, 1985). Does this mesh with Penfield's research, and if not, why not? Paul MacLean's theory of the triune brain, composed of the neomammalian, paleomammalian, and reptilian brains, each with its own structure and function, might seem to support both the TA theory of three ego states and the Freudian theory of superego, ego, and id (MacLean, 1973). Karl Pribram's work at Stanford suggesting the holographic functioning of the optic nerves would seem to support more holistic models such as Gestalt theory (Pribram, 1982). Robert Ornstein's work at the University of California Medical Center with split-brain patients suggests classifying brain functions into left and right modes of cognition, one more logical and one more intuitive and holistic (Ornstein, 1986). How do we incorporate this knowledge?

Do we selectively ignore other models from artificial intelligence, such as Zadeh's fuzzy logic? This is a logic used to model perception, applied in newly designed "smart" cameras. Where standard logic must assign a true or false value to every proposition, fuzzy logic assigns each proposition a certainty value between zero and one, so that we can say a statement is 0.7 true and 0.3 false. Is this theory selectively ignored to support our own?
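As a rough sketch of what such graded truth looks like, Zadeh's basic fuzzy connectives can be written in a few lines of Python. The propositions and their truth values here (a "well lit" scene, a "moving" subject) are invented illustrations, not part of any actual camera design:

```python
# Zadeh's fuzzy logic: a proposition's truth value lies anywhere in
# [0, 1] instead of being restricted to the two values true and false.

def f_not(a: float) -> float:
    """Fuzzy negation: a statement that is 0.7 true is 0.3 false."""
    return 1.0 - a

def f_and(a: float, b: float) -> float:
    """Zadeh's min-rule conjunction."""
    return min(a, b)

def f_or(a: float, b: float) -> float:
    """Zadeh's max-rule disjunction."""
    return max(a, b)

well_lit = 0.7  # "the scene is well lit," judged 0.7 true
moving = 0.4    # "the subject is moving," judged 0.4 true

print(f_not(well_lit))          # ~0.3: the same statement is 0.3 false
print(f_and(well_lit, moving))  # a conjunction is only as true as its weakest part
print(f_or(well_lit, moving))   # a disjunction is as true as its strongest part
```

Standard two-valued logic falls out as the special case where every truth value is exactly 0 or 1.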

Do we selectively ignore Hebb's theory of associative networks (Hebb, 1980), which models the interconnectedness of the brain's various systems? Hebb's work provides a background for those working with neural networks in artificial intelligence. These networks are so complex that they seem to fall under the mathematical umbrella of the new science of complexity, the modeling of complex systems, including living systems (Waldrop, 1992). Such complex systems are unpredictable in the short term and are not amenable to statistical analysis. This suggests that the data in the brain may be of a type that cannot be analyzed accurately using statistics. We may be running into a mathematical wall beyond which our intelligence cannot go. We may find ourselves permanently in the position of Freud, who gave up neurological research in order to pursue a theory that would substitute temporarily until researchers acquired the tools to know brain functions more precisely. These current theories suggest that the tools Freud imagined may never exist. If so, it is better reality testing to admit this than to let wishful thinking become our collective reality. It may be that we cannot found any comprehensive psychological theory on brain research. We must be open to this possibility to maintain a scientific approach.
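Hebb's associative principle, often summarized as "cells that fire together wire together," can be illustrated with a minimal sketch. This is not drawn from Hebb's text: the learning rate, the three-unit network, and the activity patterns are invented for demonstration.

```python
# Hebb's rule: the weight between two units grows in proportion to
# their co-activity, forming an association between them.

def hebbian_update(weights, activity, rate=0.1):
    """One Hebbian learning step over a fully connected network."""
    n = len(activity)
    for i in range(n):
        for j in range(n):
            if i != j:
                weights[i][j] += rate * activity[i] * activity[j]
    return weights

n = 3
weights = [[0.0] * n for _ in range(n)]  # start with no associations

# Present a pattern repeatedly: units 0 and 1 fire together,
# while unit 2 stays silent.
for _ in range(5):
    weights = hebbian_update(weights, [1.0, 1.0, 0.0])

print(weights[0][1])  # strengthened: these units were co-active
print(weights[0][2])  # unchanged: no association with the silent unit
```

Even this toy version hints at the point made above: with many units and recurrent connections, the network's behavior becomes the kind of complex system that resists short-term prediction.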

If we are open to discrediting some parts of existing TA theory, we allow ourselves to remain unbiased in our research efforts and are better able to gather random samples for our research. Otherwise we may bias our samples to lend credence to our preconceptions and remain unscientific.

From a broader perspective, why should we limit our research to statistical analysis? In questioning our basic assumptions, we must also question our assumptions about epistemology, that is, about how human beings know. I see three primary ways of knowing: the Adult functions of logical deduction and statistical induction, and the Little Professor function of intuition. By choosing statistical induction as the only valid way of confirming our theories, we exclude the other two primary ways of knowing.

How do we exclude deduction? Albert Einstein is held by many as a model scientist, yet he did not use statistical induction for his research. He used deduction, deriving many of his theories through pure mathematics. Only later were many of his theories experimentally confirmed, whereas in statistical induction the experiments precede the conclusions.

If deduction is another way of knowing in TA, where and when is it applicable? Berne did use Venn diagrams, circles either distinct or overlapping, borrowed directly from symbolic logic to describe transactions visually. Perhaps TA can in part be considered a theory that analyzes one's own deductions based on primitive childhood assumptions. Perhaps it focuses on how people become irrational in decision making. In this case, TA provides critical thinking skills for human relations and can be considered a basis for analyzing the accuracy of our reality testing. Much as mathematics provides the language for science, TA may provide the logic for human relations and can be, at least in part, a deductive language.

For example, in logic we can analyze an argument and say that it is ad hominem because it criticizes the source of the argument rather than the reasoning process of the argument itself. In TA we can accept an argument by playing the Game of Psychiatry with the premise "You are a healer," supported by various degrees and certifications, so that your argument is accurate and valid because of who you are rather than because of your clear thinking. Or, if we overtly praise a theory while covertly regarding it with skepticism, we are playing Gee You're Wonderful. If TA is in part a language of deduction, then it is beneficial to state it as such to define the theory more clearly, rather than forcing TA research into a Procrustean bed of statistical analysis.

Little Professor knowledge is based on experience where conclusions are arrived at through intuitions developed from perceptions rather than through rigorous statistical induction. Perhaps some theories of human relations are only knowable through intuition. If so, which ones and why? And how do we differentiate these theories from metaphysics?

Each way of knowing has its limitations. Statistical knowledge is so specific that it is limited in scope: very tiny bits of knowledge are acquired with a great degree of certainty, and one cannot see the forest for the trees. Intuitive knowledge achieves broader scope with a lesser degree of certainty: one sometimes cannot see the trees for the forest. Deductive knowledge achieves logical certainty but lacks experimental backing.

In our choice to research TA, it seems we have assumed that the statistical approach is the only valid method. Perhaps psychology cannot be grasped with statistical knowledge alone. Perhaps we need the broader perspective of the Little Professor and some deductive reasoning to complement pure research in order to achieve a broader and more accurate view of reality. Perhaps trying to please others in the psychological community who hold statistical induction as the only valid way of knowing is to buy into their Script, accepting an assumption that may not be accurate. And to live with this assumption without questioning it is to diverge from reality by overadapting to a collective Don't Think injunction.

In order to generate a more scientific approach to TA, we must do a Script Analysis of TA itself. We must model our own theory by applying it directly to ourselves as a group. What are our basic assumptions, both with regard to TA and with regard to epistemology? Which are currently valid and which are not? And we must be open to the possibility that some of our collective injunctions are currently invalid.

Hagar the Horrible, the nationally syndicated cartoon character, is standing on a mound above the throngs, sword held high, declaring loudly for all to hear, "Ours is not to wonder why, ours is but to do or die." One soldier in the crowd raises his hand and asks, "Hagar, can I ask just one question?" Hagar replies, "Sure!" to which the soldier responds, "Why?"

References

Berne, E. (1961). Transactional Analysis in Psychotherapy. New York: Ballantine Books.

Berne, E. (1977). Intuition and Ego States (P. McCormick, Ed.). San Francisco: TA Press.

Gazzaniga, M. (1985). The Social Brain: Discovering the Networks of the Mind. New York: Basic Books.

Hebb, D. O. (1980). Essays on Mind. New Jersey: Lawrence Erlbaum Associates.

MacLean, P. (1973). A Triune Concept of the Brain and Behavior. Toronto: University of Toronto Press.

Ornstein, R. (1986). The Psychology of Consciousness. New York: Penguin.

Pribram, K. (1982). Languages of the Brain. New York: Brandon House.

Waldrop, M. M. (1992). Complexity: The Emerging Science at the Edge of Order and Chaos. New York: Touchstone.

Copyright © Chris Boyd, all rights reserved.


About the Author

Chris Boyd, Ph.D., is past President of the Eric Berne Seminar, 1982-1994.

 


TAJnet is dedicated to publishing scientific articles related to the theory and practice of transactional analysis.
Published by the
International Transactional Analysis Association.
Copyright © ITAA, all rights reserved.