Showing posts with label philosophy. Show all posts

Wednesday, 9 July 2025

PISO and CIMO frameworks

Have you heard of PISO and CIMO before? Well, if not, they stand for population, intervention, study design, outcome; and context, intervention, mechanism, outcome (Cochrane Library, 2025; Costa et al., 2018).

PISO (Cochrane Library, 2025) is:

  • Population (or Patient or Problem): "What are the characteristics of the patient or population (demographics, risk factors, pre-existing conditions, etc)? What is the condition or disease of interest?"
  • Intervention: "What is the intervention under consideration for this patient or population?" So what are we going to do, treat, change, or action?
  • Study design: "What is the alternative to the intervention (e.g. placebo, different drug, surgery)?" What else will we consider, and how will we plan this?
  • Outcome: "What are the included outcomes (e.g. quality of life, change in clinical status, morbidity, adverse effects, complications)?" What do we expect to happen, what would we like to happen? How will we measure and know if this has worked?

CIMO (Costa et al., 2018, p. 3) is:

  • Context: "The results that human actors aim to achieve and the surrounding (external and internal environment) factors that influence the actors". What are the circumstances or environment where we research the intervention?
  • Intervention: "Purposeful actions or measures (products, processes, services or activities) that are formulated by the designer or design team to solve a design problem or need, and to influence outcomes". What is the action or change we introduced into the situation?
  • Mechanism: "The mechanism that is triggered by the intervention, in a certain context, by indicating why the intervention produces a certain outcome. It can be an explanation of the cognitive processes (reasoning) that actors use to choose their response to the intervention and their ability (resources) to put the intervention into practice". How will/might the intervention work? Quantitative research will state the mechanism in advance in order to predict the outcomes in the next step; qualitative research will ask more openly "how might the intervention work?", remaining open about the outcomes.
  • Outcome: "Result of the interventions in its various aspects". What did we end up with; what were the impacts of the intervention?

The key difference between these frameworks is that PISO is more likely to be used in clinical or experimental research designs, often in healthcare: PISO tends to emphasise who is being studied and how (Cochrane Library, 2025). On the other hand, CIMO is a management and social science tool, seeking to understand why and how interventions work in specific contexts or cases (Costa et al., 2018). Either framework will assist in systematic reviews as well as evidence-based research projects.

Following either a PISO or CIMO framework assists researchers in framing their research question (or questions), choosing methodologies, methods and data collection approaches, determining variables, and analysing and organising findings.

Anything that helps us to create stronger, more deliberate ways of researching has to be a bonus!


Sam

References:

Cochrane Library. (2025). What is PICO? https://www.cochranelibrary.com/about-pico

Costa, E., Soares, A. L., & de Sousa, J. P. (2018). Exploring the CIMO-logic in the design of collaborative networks mediated by digital platforms [Paper presentation]. 19th IFIP WG 5.5 Working Conference on Virtual Enterprises (PRO-VE 2018), Cardiff, UK, September 17-19. https://doi.org/10.1007/978-3-319-99127-6_23


Wednesday, 24 May 2023

An academic research metaphor

Have you ever thought about the metaphors used to describe academic research? I ran across a great one last year which I have been considering for some time:

"Imagine higher education research as an open, grassy field where objects of various shapes and materials scatter in isolation or in groups. Each object or group of objects is being examined by a group of researchers who use different tools. These objects can be freely assembled to create new objects; a researcher can move from one object to another, using their own tools, borrowing from other groups, or devising new ones by combining different tools. A tool is often ready-made and brought in this field from its original disciplinary factory, whether it is political science, sociology, psychology, or history. Each tool can be adjusted to fit the examination of objects in the field" (Le, 2022, p. 5).

For me, this imagery conjures a vast fecund rolling meadow in summer. I picture it full of tall grass and flowers whispering past us as we roam to each successive grove of glorious trees, each an arbour containing flora and fauna of a particular place and time. We gather elements from various groves, and return to our own grove with our collection, to examine under the dappled light of our grounding school of thought. We consider the tools we have collected for fit within our home environment, determining whether and how we can adapt this 'new' tool to suit our particular grove.

While Le considers the chosen "tool" - singular - to be "the most important thing" needed for researchers to be effective (Le, 2022, p. 5), I take a pluralised approach to 'tools', which Le indirectly alludes to: that the "objects can be freely assembled to create new objects" (p. 5, emphasis added to plurals). 

Providing we remember that using such a range of "topics and conceptual tools [may require an] epistemic leap to cross the disciplinary boundaries and connect with other researchers" (Le, 2022, p. 5), our individual research ontologies, epistemologies and axiologies can be carefully curated from both within our own grove, and from without. The collective fit of a more eclectic selection is all about the care and thought which we apply - and clearly write up - to guide those who later follow our published and peer-reviewed work.

After all, research should be “a systematic, careful inquiry or examination to discover new information or relationships and to expand/verify existing knowledge for some specified purpose” (Bennett, 1991, p. 68).

Let's be careful out there. And thoughtful. And try new things. 


Sam

References:

Bennett, R. (1991). What is management research? In N. Smith & P. Dainty (Eds.), The Management Research Handbook (pp. 67-78). Routledge.

Le, P. A. T. (2022). The academic profession from the perspectives of aspiring academics [Doctoral thesis, University of Melbourne]. https://rest.neptune-prod.its.unimelb.edu.au/server/api/core/bitstreams/3981b6f3-8c08-4608-a4e4-54dbce1b96a6/content


Wednesday, 31 August 2022

What is my philosophy

So are our pathways all pre-determined from birth? That is an interesting question, and the answer is: it depends on our personal philosophy! Which then requires us to take a step back and consider: what is a personal philosophy? Philosophy has been defined as:

"(In the original and widest sense.) The love, study, or pursuit of wisdom, or of knowledge of things and their causes, whether theoretical or practical" (Simpson & Weiner, 1989, p. 688)

While there is a flavour of that in examining the idea of predeterminism, it seems more likely that this will be the fifth meaning, that of:

"(= metaphysical philosophy.) That department of knowledge or study which deals with ultimate reality, or with the most general causes and principles of things. (Now the most usual sense.)" (Simpson & Weiner, 1989, p. 688). This usage dates from 1794, when one J. Hutton (Philos. Light, etc., p. 121) said: "Now, philosophy is that general knowledge by which the works of nature are understood in seeing the wisdom of design" (p. 688).

So how do we know what our personal philosophy is? For those who have studied this area, this is a reasonably easy question to answer, but the rest of us are probably in the dark about all the 'isms'. When I stumbled upon this infographic from Vital (2020), I found that it very briefly sums up many of the potential choices we may have:

In considering our pathway choices through the viewpoint of fatalism or determinism, the answer is a clear "yes", all things are pre-determined from birth. But if we take a subjectivist viewpoint, that confidence becomes a 'maybe'. If we pop on the hat of existentialism or absurdism, we get a 'no': everything is either down to us ...or what does it matter anyway?

It really does all depend.


Sam

References:

Simpson, J. A., & Weiner, E. S. C. (Eds.). (1989). Oxford English Dictionary (2nd ed., Vol. XI, Ow-Poisant). Clarendon Press.

Vital, A. (20 November 2020). The Meaning of Life According to Different Philosophies [infographic]. Adioma. https://blog.adioma.com/meaning-of-life-according-to-philosophy/


Wednesday, 15 September 2021

Moving from what to why

In career practice there is a tricky balance between task and relationship.

As career practitioners, getting stuck into "Doin' the Do" (Clarkson, 1990) can be sooo seductive. It is easy, we can see an immediate gain, and we feel we have been efficient. A bit like a doctor prescribing for a patient, we get a result for the client and they leave the office with something concrete.

We want information. We want delivery of a thing. The transaction is routine. It is all in the now. Things may be 'done to' the client. The client can be passive in the process, and the career practitioner does the doing. It is the "what" of career practice.

However, I have come to realise over the years that the real mahi in career practice is in digging into the "why" for my client. It is harder, there is usually nothing to see immediately, but it is sticky. If we get the grounding - the turangawaewae - right for the client, then everything else "runs, like a river to the sea" (U2 & Bono, 1987).

This is mahi which changes both of us as actors in the process. It is centered on the client. It is about relationships. It is visionary, but visionary in the hands of the client: the service is not 'done to' them, but is done BY them. It is future-focused. The career practitioner waits to see where the client will lead.

The expert 'deliverer' approach to career practice is sometimes an expectation for secondary school students, but, "[t]he older the young adults are, especially [women], the more their ideas about counselling match with the method of client-centred counselling" (Paszkowska-Rogacz, 2008, p. 122).

So not only do counsellors become more relationship-focused as we age, but so too do our clients.

Growing together :-)


Sam

References:

  • Clarkson, A. M. (1990). Doin' the Do [Recorded by Betty Boo]. On Boomania [CD]. UK: Warner Brothers.
  • Paszkowska-Rogacz, A. (2008). Predictors of client expectations from career counselling. Ergonomia: An International Journal of Ergonomics and Human Factors, 30(2), 119-133. https://d1wqtxts1xzle7.cloudfront.net/47035780/Predictors_of_client_expectations_from_career_counselling.pdf
  • U2 & Bono (1987). One Tree Hill [Recorded by U2]. On The Joshua Tree [LP]. Dublin: Island Records.


Monday, 15 March 2021

A simple view of method

There is some research that, when you read it, sounds incredibly well-planned. But I am also wondering if we get into the habit of making our methods sound good at the end, as opposed to actually being good from the outset.

I suspect that we tend to make methodology and methods too complicated - arcane, even - in academia. All the esoteric categories and sub-categories sound so deliberate. However, the more I read, the less sure of the 'deliberateness' I become.

Recently I ran across these powerful words:

"I begin to see that the whole idea of a method for discovering things is ex post facto. You succeed in doing something, or you do something so well that you yourself want to know how you did it. So you go back, trying to re-create the steps that led you, not quite by accident, not quite by design, to where you wanted to be. You call that recreation your ‘method’. (Koller, 1983: 88 [...])" (Thorne, 2016, p. 19).

Wow. Well, it looks like other people also think we 'make it up' as we go along in research. Even when we write our methodology up for publication, we tend to follow a "drunkard's walk" (Heinlein, 1980, p. 164) approach. A drunkard's walk can be defined as:

"a mathematicians' term for a two-dimensional random search. The name comes from the colorful image of a drunk standing in the dark between two lamp posts. The drunk wants to get to a lamppost — he doesn't care which — but he's so intoxicated that he can't control which direction he's stepping in; all he can control is that he is walking toward a light. Every step he takes is a 50/50 split between going one way and the other. Eventually he will reach a light, but how long it'll take him is the big question" (Schroeck, 2012).
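As a playful aside, the drunkard's walk is easy to simulate. The sketch below is my own illustration (not from Schroeck): it estimates how many 50/50 steps the drunk needs, on average, to reach a lamppost.

```python
import random

def average_steps_to_lamppost(distance: int, trials: int = 2000) -> float:
    """Estimate the mean number of 50/50 steps a drunk starting midway
    between two lampposts (each `distance` steps away) takes to reach one."""
    total_steps = 0
    for _ in range(trials):
        position = 0
        steps = 0
        while abs(position) < distance:
            position += random.choice((-1, 1))  # one step toward either light
            steps += 1
        total_steps += steps
    return total_steps / trials
```

For a symmetric random walk the expected time grows with the square of the distance, so a drunk starting five steps from each lamppost needs about 25 steps on average: how long it'll take him really is the big question.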

The more expert we become and the more experience we accumulate, the fewer our elements of drunkard's walk will be. But is that because we have learned more about our 'chosen' method, or is it that we have learned what category our natural method inclination is most aligned with?

Or some of both? And does it matter?


Sam

References:

  • Heinlein, R. A. (1980). The Number of the Beast. New English Library.
  • Koller, A. (1983). An unknown woman: A journey to self-discovery. Bantam.
  • Schroeck, R. (2012). Latest Update: 29 November 2012. http://www.accessdenied-rms.net/dw2conc.shtml
  • Thorne, S. (2016). Interpretive Description: Qualitative Research for Applied Practice (2nd ed.). Routledge.


Wednesday, 9 December 2020

Characterising approaches to argument

It is generally considered that there are three types of academic argument: classical - aka Western, or Aristotelian - argument (Excelsior Online Writing Lab, 2020; Macauley, 2020); Toulmin argument (Toulmin, 1958); and Rogerian argument (aka 'persuasion'; Nordquist, 2019).

However, across these three types, there are also a few characterising approaches to argument. Those are considered to be structural; pragmatic; and the cluster of inductive, deductive and conductive (McKeon, 2020), as follows:
  1. Structural: this is the "if this, then this" type of argument. This type of argument can be displayed in standard form: Premiss 1, Premiss 2, Premiss 3, Therefore: Conclusion (P1, P2, P3, C). This is used a lot in science, often paired with hypothesis-testing research. There is no 'why': there is only evidence.
  2. Pragmatic: this is where a 'reasoner' proposes several premisses as supporting reasons, and explanations of 'why', "to rationally persuade an audience of the truth of the conclusion" (McKeon, 2020; Peirce, 1908). We gain an understanding of other perspectives, but also have to be alert to the reasoner's aims in case they differ from our own, or are contrary to reality.
  3. Inductive/deductive/conductive:
    1. Deductive: basically, if the premisses are true, then the reasoner's argument should be valid; a step-by-step, known-outcome model. McKeon provides an example: "It’s sunny in Singapore. If it’s sunny in Singapore, then he won’t be carrying an umbrella. So, he won’t be carrying an umbrella" (2020). We talk a lot about validity here. This is a mathematical argument (even though this is often called 'mathematical induction'). We can see how this fits with structural argument.
    2. Inductive: this is where, if the reasoner's argument is strong enough, then the argument is likely to succeed. You can hear the probabilities whirring in this one! As McKeon illustrates, "For example, this is a reasonably strong inductive argument: Today, John said he likes Romona. So, John likes Romona today. (but its strength is changed radically when we add this premise:) John told Felipé today that he didn’t really like Romona" (2020). Ouch for Romona. We talk about reasoning here. This is a humanistic, social sciences, management-style of argument. We can see how this fits with pragmatic argument.
    3. Conductive: the reasoner provides "explicit reasons for and against a conclusion, and requiring the evaluator of the argument to weigh these competing considerations, that is, to consider the pros and cons" (McKeon, 2020). Provide all the arguments and let the audience decide for themselves. We can see how this too fits with pragmatic argument.
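Deductive validity, in particular, can be checked mechanically: an argument is valid when no assignment of truth values makes every premiss true while the conclusion is false. Here is a minimal truth-table sketch of McKeon's umbrella example (the encoding is my own, not McKeon's):

```python
from itertools import product

def umbrella_argument_is_valid() -> bool:
    """Truth-table test of McKeon's example.
    P1: It's sunny in Singapore.
    P2: If it's sunny, he won't be carrying an umbrella.
    C:  He won't be carrying an umbrella.
    The argument is valid iff no row makes P1 and P2 true while C is false."""
    for sunny, umbrella in product([True, False], repeat=2):
        p1 = sunny
        p2 = (not sunny) or (not umbrella)  # material conditional: sunny -> no umbrella
        c = not umbrella
        if p1 and p2 and not c:
            return False  # found a counterexample row
    return True
```

No counterexample row exists, so the function returns True: the 'step by step, known outcome' quality of deduction is exactly this exhaustive checkability.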
And then I read a great piece by Patters on retroductive argument (Thompson, 1999), which is also known as abductive argument. This is where "an explanation is proposed to account for an observed fact or group of facts, [...] i.e. any type of similarity or co-occurrence, including (but not limited to) location in space and time. For example, 'Jones was in the building at the time of the murder. Perhaps he is the killer,' or 'The blood on the victim's shirt matches Jones' blood type. Perhaps Jones is the killer.' In the second example, the similarity of blood type is the concomitance on which the inference turns" (Thompson, 1999). Lovely.

So retroduction - aka abduction - is where we take an "observation or set of observations and then seek[...] to find the simplest and most likely conclusion from" them (Leslie & Van Otten, 2020): an Occam's Razor approach, if you will (though we do need to be careful of affirming the consequent). What I also find useful is how well abduction fits with Peirce's pragmatic approach (Commens Digital Companion to C. S. Peirce, 2020). What is also interesting is that 'abduction' is thought to be a corruption of retroduction. Conduction is not mentioned anywhere. It seems that, for Peirce, retroduction is conduction. Different schools, maybe.

We could plot the characteristics of argument on a continuum and see the shift from absolutism to fuzzy logic, as is shown in the accompanying illustration. I suppose someone has done this before, but it was new to me :-)

Interesting.


Sam

References:

Friday, 23 October 2020

The social sciences

It is not often that we really step back and think about our own fields, and where those fields fit in relation to everything else. I was idly thinking about the social sciences, and why business is a part of that field (Sodertorn University, 2020).

Scientia is from Latin meaning ‘knowledge’, and referring “to a systematic and organized body of knowledge in any area of inquiry” (Bhattacherjee, 2012, p. 1). Science falls into two halves: natural science; and social science. The natural sciences focus on objects or phenomena occurring naturally (e.g. light, matter, our planet and beyond, or living things). Science - natural science - tends to take a more objective approach (Frey, 2013).

The social sciences are the study of people or groups (e.g. organisations, societies, economies, or behaviours), and include “psychology (the science of human behaviors), sociology (the science of social groups), and economics (the science of firms, markets, and economies)” (Bhattacherjee, 2012, p. 1). The social sciences tend to take more of a subjective approach (Frey, 2013).

OK: so what then separates the humanities and the social sciences? Professor Iain McLean pinpoints a wonderfully cogent difference: “humanities are (mostly) interested in the unique; social sciences are (mostly) interested in the general”, continuing on to provide an example, that “Social statistics cannot predict how I will vote in the next election, but they can help to predict what most people like me will do” (McLean, 20 November 2018).

These delineations are somewhat arbitrary though. Without going into the Arts, where things get even more murky, let's stay on the simple side of the divide, and consider the areas where business has a clear involvement. Depending on what is studied, and the methods used, business can easily fall into a number of different disciplines. For example, research into stock market returns can explore the connection between a range of mathematically measured factors (science); the reasons why stockbrokers invest in certain stocks (social sciences); or the narrative of a particular stockbroker (humanities).

It all comes back to where we stand (ontology), what we want to learn, and how we come to knowing (epistemology). And we could fling a bit of axiology in there as well for spice :-)


Sam

References:

Friday, 13 September 2019

Being Wrong, Part 2

In part one of this post on being wrong (here), I had read a sample of MIT Professor Hal Gregersen's book (2018a), and said that I would report back on his book later in the year.

It took me some time to get a copy of his book in order to review it, but the job is finally done, thanks to an Audible copy (which also means that I can't provide page numbers for the quotes, sorry!).

One of the key pieces of value I got from Hal Gregersen's book was the clear detail on how to brainstorm effectively, based on his teaching experience. He proposes that success in brainstorming comes from generating questions, not from providing answers. In fact, he sees providing answers as limiting the conversation (2018a).

Gregersen (2018a) proposes that powerful leaders get to ask all the questions, while their staff are there to provide the answers. This power 'over' others may limit the conversation (Marshall, 1984), as those who are “power hungry aren’t seekers of truth; they are seekers of advantage” (Gregersen, 2018a). Ouch. It is as if an organisation can have a "toxic" way of asking questions, in addition to having toxic leadership. This is an interesting idea which the adoption of Gregersen's approach to brainstorming would help to circumvent.

Further, Gregersen suggests that asking good questions becomes synergistic, melting barriers and kick-starting creative, lateral and - in hindsight - 'obvious' solutions which would not normally have been arrived at (2018a). However, Gregersen also notes that to “question anything fundamental is by definition to challenge” (2018a), so the organisation must be willing to challenge the process, and to have what is - in essence - an authentic leadership approach (Kouzes & Posner, 2007).

The approach taken to brainstorming is detailed well, in three stages:
  1. Stage setting: Invite a group - if possible, people with "no direct experience with the problem and whose worldview is starkly different" from ours - to the brainstorming session. We pitch the problem to the group in two minutes (or less) (Gregersen, 2018a, 2018b, 2018c).
  2. Brainstorm questions: We then set a timer for four minutes, and together generate as many questions as we can about the problem. There are two rules: we don’t answer any questions, and we don’t explain why we’re asking about the problem. We aim for 15-20+ questions in the time. Record the questions word for word as they are proposed (we might need to record the session or to have several people to record) (Gregersen, 2018a, 2018b, 2018c).
  3. Identify our "quest": Review the questions together. Select "a few 'catalytic' questions", those which we think might be the most disruptive. If we don't find anything that is looking disruptive enough, we repeat step 2. Rinse repeat a third time if we have to. Then we commit to pursuing at least one new idea "as a truth seeker", not as a defender of the status quo, and see what that does for the problem. This is a very action-oriented section, requiring the "innovator's focus on the 'job to be done'" (Gregersen, 2018a, 2018b, 2018c).
And, if you are wondering, catalytic questions are those which reframe a situation so that we can see a different solution. They give us those forehead-slapping, "of course!" moments where we have no idea why we couldn't see this particular solution before. Catalytic questions allow us to make more effective and longer lasting change, which creates competitive advantage (Gregersen, 2018a). Catalytic questions give us a paradigm shift, like Tesla powering its cars with thousands of small AA-sized cells embedded in the chassis rather than one enormous, heavy, bulky battery which limits design options.

Professor Gregersen has been working on this approach to brainstorming for a number of years, and has had a lot of success with it. The book is very interesting, an easy read, and the brainstorming is worth a try as well.

Worth the purchase.


Sam

References:

Monday, 22 July 2019

Feeding the Noble Horse

Imagine a chariot drawn by two winged horses. Imagine that one horse is a noble steed with impeccable breeding and manners; and the other is a dark and dodgy character, a herring-gutted nag who evades, bucks, rears and naps. We, as the charioteer, are the voice of reason (Uebersax, n.d.). Plato suggests this allegory as the duality of human nature: our chariot is constantly drawn in non-aligned directions by these two horses, who have very different agendas (Wikipedia, 2 May 2019). The light and the dark; the good and the bad; angel and devil; service versus selfishness.

Like the wolf allegory attributed to North American Indians (Wikipedia, 6 May 2019), the horse which wins is the horse we feed. If we feed our "black" horse, we will reward ourselves for being lazy, we will procrastinate, we will waste our time watching cat videos and looking at shoe porn on the internet, creating our own crises. We will behave, as my Grandmother used to say, like we are "all talk and no trousers".

However, if we, as the charioteer, feed the "noble" horse, we build good habits, we reward ourselves for making good choices, we apply techniques to limit procrastination and manage our time as well as we can. Through self-awareness, we self-regulate (Goleman, 1998). We don't aim to be perfect, but to behave along noble lines.

I am thinking that this could be the year of the charioteer feeding the noble horse.


Sam

References:

Monday, 20 May 2019

Being Wrong, Part 1

I don't know about you, but I hate being wrong. It is even worse when I have climbed upon my high horse, then have to climb down again, experiencing that warm wash of shame. The more sure I have been about being 'right', the hotter the shame on facing the evidence of my failure.

But if we want to learn, we can't avoid being wrong. Being wrong means that we are learning, because learning changes us and moves us from one place to another. Learning changes our perspective. It shows us that what we think is going on is merely the top fraction of a millimetre, and to really get to grips with things, we will have to dig a lot deeper.

Being wrong is a normal risk with learning anything new. Yes, we run the risk of looking foolish, of being thought less of by our peers, of showing that we don't know it all. But when we consider that we now all pay to be students, that is a pretty silly way to think. Why would we pay to stay the same? Unless we are open to the fact that we don't yet know it all - obviously, or else why are we studying - we aren't welcoming the challenge of new ideas.

To learn, we must move ourselves from a fixed mindset to a growth mindset (Dweck, 2006). To do that, we have to be able to question our own beliefs, to seek new evidence, to analyse and to test new ideas, and to acknowledge that the new ideas may well displace our existing ones.

If they didn't, I would be worried that we had wasted our money.

So why is it that in the workplace, we like to think that the manager has all the 'right' ideas? Why do some organisations create a culture of 'don't ask questions'? That's pretty risky too, I think: and a wrong-headed way to manage.

There is a new book out, called "Questions are the Answer: A Breakthrough Approach to Your Most Vexing Problems at Work and in Life" by Hal Gregersen. It deals with being wrong, but with a view to reframing our approach, and how that strategy is very good for us (Vozza, 21 January 2019). I have read a sample chapter, and am hoping to get a copy so I can find out if it will change me.

I hope so. I will report back later in the year.


Sam

References:

Wednesday, 24 April 2019

Talking about Why

Sinek’s Golden Circle (4 May 2010)
One of my students' favourite YouTube clip discoveries year after year is a TED talk with Simon Sinek talking about what he terms "The Golden Circle".

The Golden Circle is the magic place which, in Simon Sinek's own words (4 May 2010, 4:03):
"Here's how Apple actually communicates: everything we do we believe in challenging the status quo. We believe in thinking differently. The way we challenge the status quo is by making our products beautifully designed, simple to use, and user friendly. We just happen to make great computers. Wanna buy one?"
As he says to introduce the idea of "Why" (4 May 2010, 02:22), “Every single person in every single organization on the planet knows what they do: 100%. Some know how they do it; you would call it your differentiating value proposition, or your proprietary process, or your USP. But very, very few people or organizations know why they do what they do."

So why do we get stuff backwards? What makes us forget to tell people the 'why'? Harvey Deutschendorf (17 January 2019) suggests it is to do with our limbic brain, and that we are actually run by emotions, for all we would like to think of ourselves as rational, careful decision-makers. He says "Our limbic system is one part of the brain that holds our emotions. The neocortex, on the other hand, is our powerful, thinking mind, which makes the decisions based upon information. When we make decisions, we probably like to think that we’re basing them on facts and data. But we’re making them based on our emotions, on the 'why'."

What do Black and Decker sell? Not drills, but holes. Customers want holes. What do motorbikes give us? Perceived freedom. What does face cream give us? Eternal youth. We need to truly understand what it is that we sell to our customer from the customer's point of view. We need to ask, survey, and get inside our customers' heads to find out exactly what it is that our product gives them. In education, what does our target market get? Choices? Confidence? Access? Networks? New opportunities? Knowledge? A second chance?

Until we find out, we are rudderless in being able to truly connect with our own 'why', which then means that our decisions will not lead to a strong and truly communicable brand.

We need to talk.


Sam

References:


Friday, 27 July 2018

Learning for learning's sake

Tom Stanton presented a paper at the International Centre for Guidance Studies (iCeGS) 20th Anniversary Conference, held over 23rd and 24th May, at the University of Derby, UK, on where political and career views intersect.

If our ideas are neoliberalist, this may make us view career as a means for work, making 'efficient' and 'effective' use of our time. A 'responsible' approach to career would then mean that we are 'ready' for work. But should our careers align with a neoliberalist political view? Could we instead consider career as a politically free space, where we can "run against the grain by offering individuals space for free thought" (Stanton, 2018)?

Earlier this year I read a UK report which was about matching graduate skills with workplace requirements. I felt that the report had an underlying implicit assumption that education is a workplace supply chain (Universities UK, 2015); which is also a neoliberal political view. Later I read a LinkedIn post and article by Cecilia Chan who seemed to be saying the same thing: that degree training was technically skill-based, with efficiency and effectiveness as key ingredients to ensure we turn out work-ready graduates: so much so that we should send academics into the 'real world' regularly to give them a reality check of what the world of work is all about.

After reading these pieces, I asked myself "what is higher education for?" Does education, as Descartes thought, cloud our mind? Or does it open it? Is education about teaching people to learn, or about training people to do? They are different processes, with different outcomes, requiring different foci...

I am of the 'free thought' approach to education, which I think is a somewhat meritocratic view. I see education as a way to teach ourselves to think, to imagine, to create, and to make great connective leaps. Education then leads to research through intellectual freedom. A by-product of undertaking research in the modern tertiary sector is that we upskill and stay in touch with modern practice within our specialty. We also do this by talking to colleagues, going to conferences, reading and writing for journals ...and being members of LinkedIn and other professional platforms. These activities are part of our performance markers at our institutions.

When we learn in a meritocracy, we can take our ability to learn, to imagine, to create and to connect out into the workplace and add value. We have depth, range and extendibility. We are not two dimensional neoliberalist single-use cogs with a fixed use-by date, destined for slotting into a machine when the previous cog fails.

Educators no longer live in ivory towers: zero hour contracts, delivery load, research outputs, customer service and funding models have put paid to that (or it is certainly so where I work). While there are probably some who can still game the system and evade upskilling, I suspect that these are the single-digit-percentage exceptions, not the rule.

Then there were those trusty old professors who thought that education's value lay in learning for learning's sake. They inspired us, and helped us learn because the learning itself was the challenge. They helped us to explore, to synthesise, to ask questions because we could. And much of our research happened because we were free to consider. We took the time to think. I think these profs are extinct... but I wish they weren't, because I like their approach.

I long for a return to learning for learning's sake. Slow learning. Career consideration. Time to imagine.


Sam


Wednesday, 16 May 2018

What is a degree for?

Recently I ran across an interview about higher education, where the interviewee was suggesting that higher education was a waste of time, resources and investment. The interview was about a book that the interviewee had written, "The Case Against Education: Why the education system is a waste of time and money". The argument was that "public education is waste of time and money and we should stop investing in it" because "the payoff for education isn’t really coming from learning useful job skills. Nor is it coming from students savoring the educational experience. Rather, most of what’s going on is that people are showing off — or, as economists call it, they are 'signaling'. They are trying to impress future employers by showing how dedicated they are" (Illing, 16 February 2018, citing economist, Bryan Caplan).

That's an argument I haven't heard before: education as 'showing off'. I do think that completing an undergraduate degree tells an employer that you can stick at something for 3-4 years and get it done. Tenacity. Perseverance. Completer-Finisher. But I don't think that this is 'showing off'.

Putting aside my doubt about the validity of the 'showing off' statement, I feel that Mr Caplan's argument has another flaw: that the job of higher education is solely to make us employees. He also thinks that we should all pay for our education, right the way through from Kindergarten. He thinks that "Kindergarten through 8th grade tends to serve as a daycare center for kids while their parents are at work. The educational waste really becomes a problem in high school because at that age kids could be doing something far more productive, like an apprenticeship or a vocational school" (Illing, 16 February 2018, citing Bryan Caplan).

Wow. Early learning is day-care. Being in secondary school is why people are no longer doing apprenticeships. Crikey, talk about an 18th century approach to education. I was amazed that Princeton published it. OK: I am probably being somewhat unkind, as Mr Caplan does imply that some students will be "savoring the educational experience", but the main thrust of his argument seems to be an economic one, and not a developmental one.

Funnily enough, I don't think that 'day-care' and 'vocational training' present valid arguments either. Nor do those who sign up apprentices think this way. Here in New Zealand the minimum age to begin an apprenticeship is 15, requiring a dispensation to leave school a year early. There are some sound reasons why we don't want younger people going into apprenticeships: there is good evidence that our brains do not fully understand consequences until we are 18 or more (Moffitt, Poulton, & Caspi, 2013). Health and safety, customer service, process and procedure, and self-directed learning all tend to be better understood if we just hold off the start by a couple of years to gain a little more maturity.

I did my degree for interest. Yes, I assumed I would find a job, but primarily, it was so I had some training in something that I was interested in. In my view, higher education is there to train us to refine our thinking and our curiosity, and to develop an understanding of what professionalism looks like in our field. For me, a degree is to teach us to learn, not to create drones who can get a job.

If my summary of the article is correct, then Mr Caplan's argument is that the sole purpose of higher education is to get work. I assume an educational and economic fail on his terms would be if we didn't get a job in our field.

In my view, an educational fail is if we don't develop life-long learning... that and the understanding that no learning is ever wasted. Quite a different philosophical approach.

Hmm. I wonder what Mr Caplan's qualifications are...


Sam


Monday, 29 May 2017

Knowledge versus Information

I read a great quote the other day, from Canadian novelist, Louise Penny.

In one of her Inspector Gamache novels, "Bury Your Dead", one of her supporting characters, Inspector Langlois, thinks that a particular library
"smelled of the past, of a time before computers, before information was 'Googled' and 'blogged.' Before laptops and BlackBerries and all the other tools that mistook information for knowledge" (2010, p. 58).
I thought that this phrase was quite telling. Why? Because I think we regularly mistake information for knowledge.

Often.

Knowledge is defined as the "facts, information, and skills acquired through experience or education; the theoretical or practical understanding" (Google, 2017b). It implies mastery of something.

Information is, as we can see from the definition above, a subset of knowledge. Information is the "facts provided or learned about something or someone" (Google, 2017a).

A deep and abiding knowledge of something can be fudged for a short time by using superficial information. However, when we need to tackle something that is not in the ordinary run of things, we need knowledge. We need knowledge in the sense of understanding, command, or mastery.

We can give students lots of information via MOOCs, but until they apply the learning, trial it, fail at it a few times and finally master it, it remains information and does not become knowledge.

Nothing beats careful, complete and diligent scaffolded learning.

And that is probably why MOOCs on their own will not replace degree programmes in the near future.


Sam
