Tuesday, December 21, 2010

"What Value for a Degree?" Part 2: Inherent value.

To continue from yesterday's post about the "relative value" created when education is a scarce commodity, today I'll write about inherent value--that which we are assumed to obtain simply by completing an educational credential.

Governments are concerned with developing "human capital", which is the value of the workforce as measured by people's skills and capacities for economic production. The argument is that the “knowledge economy” requires more and different skills of the workforce. This assumes that everyone should have more education because education will develop these skills (as economic value that resides in people). So by extension, there is an assumption that education has an inherent value—as something that contributes to the economy through the gross increase of human capital—no matter whether there are better jobs waiting for the graduates.

An assumption of inherent value also means that a financial payoff is assumed for the individual—so there is (economic) value in education for the individual student (or graduate, at least). This dovetails with the current (neo-liberal) policy trend of privatising the sources of PSE funding, including through raising tuition fees. Individual value means individual benefit, and therefore individuals should pay for this benefit.

But as discussed in my previous post, education does not benefit every student equally. Taking an “average” increase in earnings over a lifetime—the most frequent means used to “prove” the monetary worth of an investment in PSE—is a poor way of assessing the positive effects of higher education on the most vulnerable/least privileged students, the very people who stand to benefit most from it.
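(For a quick toy sketch of what's wrong with relying on the mean, here it is in Python--every number below is invented purely for illustration: a few very large payoffs drag the "average" premium far above what the typical graduate actually sees.)

    # Toy sketch with invented numbers: lifetime earnings premiums for ten
    # hypothetical graduates. A couple of big winners pull the mean far
    # above the typical (median) outcome.
    premiums = [5_000] * 6 + [20_000] * 3 + [250_000]

    def median(xs):
        xs = sorted(xs)
        mid = len(xs) // 2
        return xs[mid] if len(xs) % 2 else (xs[mid - 1] + xs[mid]) / 2

    mean = sum(premiums) / len(premiums)
    print(f"Mean premium:   ${mean:,.0f}")              # $34,000
    print(f"Median premium: ${median(premiums):,.0f}")  # $5,000

The "average" says the investment pays off handsomely; the median says most of this hypothetical group barely gained at all.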

---------------------------------------------------------------------

In government policy there seems to be confusion between the inherent value created by a university education (i.e. skills, training, knowledge) and the relative value of a scarce commodity. But what does this difference in concepts of “value” mean when it comes to public debates about education, and the kinds of policies that generate and are in turn influenced by those debates?

It tends to mean that we fight for university accessibility primarily in the form of increased enrolments, then wonder why attrition rates are so high and why so many students seem to “fail” at maximizing the resources provided by universities (such as student services). It means that governments create targets for the number of university graduates to be “produced” and for the percentage of the workforce that should possess a degree, assuming the additional human capital will generate returns in the form of national economic success--and that many graduates nonetheless find themselves struggling to find work for lack of jobs appropriate to their level of education. Never mind ballooning debt loads, since personal financial “returns” to education should take care of this (unequally distributed) burden.

But if there is no job waiting at the end of an expensive degree, then the personal “investment” made by the student is seen as a failed venture for which s/he takes primary responsibility (particularly if student debt is involved).

In the UK right now we can see a clear example of this logic at work. As the system has expanded while continuing to use the elite model of governance, costs have increased and the economy has become increasingly volatile. The government's response has been to radically reduce funding for teaching and to allow universities to raise tuition. Students are told they must now pay for something that in the past was more or less free (i.e. for their parents), a situation that creates inter-generational resentment, producing as it does a lopsided distribution of payment for the lingering costs of expansion.

Yet students will continue to enroll (if places are provided), since university degrees are considered more necessary now, for more people, than ever in the past. It seems that the cost of education rises, and indeed the value diminishes, with increased demand--the opposite of how markets are supposed to work.

Monday, December 20, 2010

"What value for a degree?" Part 1: Relative Value

A friend of mine, who teaches at an English-speaking middle school in Hong Kong, recently asked me if I think too many people are going to college (university).

I think about this a lot, since completion and participation targets are often in the PSE news and in policy. I always find it a hard question to answer—partly because answering it means asking ourselves about the purpose of a university education, and about why university degrees are somehow assumed to equip young people with what it takes to succeed (economically) in the world. What is it that makes a university degree valuable, and why is this important?

The focus for students, parents and governments is significantly economic, in policy and in practice—something that has become more the case over time as universities have moved towards “massification” (expansion) and more emphasis on private sources of funding (including tuition).

Post-secondary (and particularly university) education is expected to increase both the prosperity of individuals and the competitiveness of the national economy. So why is it important to question both the "graduation imperative" as economic policy, and the "accessibility" ideal as progressive social policy?

While in the past it was true that people who earned university degrees then went on to have more economic success, this was partly because university education was an elite education. No more than 5 to 10% of the population had a degree, so it was a valuable thing to have. Higher education usually meant training to be part of an elite; for example, the traditional “liberal education” was training for a small, privileged group who would become the “leaders of society” in law, politics and business.

In a sense, we’re now saying that as many people as possible should have an education of this kind, which means that by definition a university degree ceases to be “elite” in the way described, or to provide any value based on scarcity. This doesn’t mean there is no other kind of value—only that a degree will no longer provide the benefits of a scarce commodity (to the extent that it did in the past). It also means that universities are and will be using more tactics to explicitly demonstrate the value of what they offer (marketing, advertising).

In a system in which we rank and label people, a lack of obvious comparative value creates a problem, since we need to differentiate in order to allocate. If in the past the university degree acted as a filtration mechanism or a stamp of elite approval, it was the case that you had to have money, family/social connections, and/or a lot of smarts and savvy to get one. But how does this “filtering” happen when everyone gets a degree?

The cynical (or perhaps realistic) answer is that a relatively “elite” group will still form, and it does; filtration still happens because our system is driven by a capitalist economic model that works as a hierarchy driven by competition. People are ranked (using grades, for example), and it’s understood that this is more or less a zero-sum game. And some people still start out with far, far more than others when it comes to securing the highest spots in that ranking.

Yet most education systems are premised at least to some extent on the concept of meritocracy, the idea that people succeed based on “merit” or “excellence” alone, rather than through forms of extrinsic (often material) advantage. Though we have plenty of examples to support the idea that meritocracy functions fairly—e.g. working-class kids who “make good”—the wealthier and well-connected students still tend to get the best jobs in the current climate, no matter how many others may have university degrees. And from the inside, it tends to look like this is because of differences in cultural, social and economic capital, rather than "merit" alone.

-----------------------------------------

Coming up soon, in Part 2! Does education have an inherent value in this context? Human capital, personal 'investment', and unequal/unexpected 'returns'.

Sunday, December 19, 2010

Performing Professors

As an amusing follow-up to the last post, about humour, here is an article that coincidentally appeared in the New York Times, discussing professors who have been using stand-up comedy as a means of diversifying the "audiences" for their research work. Fascinating to see "public science" being taken into this arena...

Friday, December 10, 2010

Go on, have a laugh

This week’s long and rambling post, after a hiatus of about a month, comes out of my thoughts about the tutorial group I’ve been working with this term.

After each class, on the bus ride home, I think through the things that seemed to work and the things that didn’t. Which students were really engaged in class, and who was tuned out, playing on a laptop or sending text messages? Did we use media in the class and did that work well for the group? Did we look in a deeper way at the key points from the week’s readings, or did we spend a lot of time on irrelevant tangents? Perhaps most important, what was the overall dynamic in the room and did it help or hinder the discussion of issues important to the course?

Last week, I was “chuffed” when a student said she had remembered the meaning of a term based on a joke (a humorous anecdote) I had told about it. Her comment made me think about how humour is something I use in class, in a number of ways according to context—and I realise now that I’ve been 'using' it right from the moment I stepped into a classroom to teach for the first time. It turns out that my teaching role models are my favourite stand-up comedians as well as the best professors.

This led me to ask: What's the function of humour in the classroom?

The more I thought about it, the more I realized that humour, being humour, simply isn’t taken seriously as a pedagogical tool.

And yet there's a use for it. When I was first learning how to lead tutorials, humour had the function of dissipating my own sense of awkwardness at the situation. Since I wasn’t used to taking on authority, and didn’t feel comfortable with that role (i.e. the kinds of expectations the students had of me), the laughter made it easier for me to deflect and dissolve my own anxiety and that of the students, as well as creating a “cushion” for those times when I felt incompetent and unhelpful (usually, as I later learned, this was just my own perception). Another effect was that students seemed to feel more comfortable in a classroom where a few laughs were encouraged.

To me, humour has also been a means of highlighting the ridiculousness of 'normality', which is an entry point to critique (for example I showed this sketch in tutorial, as a way of addressing essentialism). I can't count the number of times I've found myself inadvertently 'opening up' (making accessible) a perfectly 'serious' issue by making a joke.

Humour is an important strategy when lecturing with a large class, as well. In some ways, the skills demonstrated by stand-up comedians could be seen as a pretty fair fit with those required of lecturers in the university setting--keeping the attention of a large audience for a couple of hours without them being distracted, in such a way that afterwards they somehow remember what you talked about. Those skills are applicable across boundaries. And just as many professors make jokes about their academic material, many of the best comedians have a serious point driving their work.

Two of my favourite performers of stand-up comedy are Bill Bailey and Dylan Moran. Like all successful stand-ups, Bailey (who is English) and Moran (Irish) have 'trademark' on-stage styles. From Moran's shows, what strikes me in terms of applicability to teaching are his uses of narrative, creative language, and vocal modulation. In this clip, he discusses the idea of having untapped personal "potential": "leave [it] absolutely alone", he advises, before launching into a lengthy, fantastically detailed description of what you imagine your potential to be ("flamingos serving drinks")--as opposed to what it actually is. Like the best lectures, this performance is impossible to re-create through quotes alone because Moran's style is the greater part of what makes the material funny and engaging.

Bill Bailey, on the other hand, has a way of soliciting responses from the audience and incorporating them into his act; he also takes slight in-the-moment thoughts and accidental slips and turns them into commentary and productive tangents. In one section of his show "Part Troll", he involves the audience in making the sound of "a giant breaking a twig", then invites them to shout out the names of famous vegetarians (which he re-imagines as a horse-race). Bailey has a knack for creatively incorporating the unexpected into his 'act', in ways that generate relevant connections without losing the overall 'thread'. I think this translates as an important classroom skill because it can help to involve students in a discussion, if we can relate their contributions, their experiences and examples, to a theme that's part of the course--without 'losing' the point at hand.

I don't consider teaching to be all 'performance'--and not all humour is helpful or appropriate in the classroom. But after watching so many tedious, monotonous lectures in which students (in some ways justifiably) tuned out of the course and in to their iPhones and laptops, I've developed an appreciation for presentation--and I'll take my role models where I can find them!

Thursday, December 9, 2010

Passing Around the Kudos

This week I was given a rather generous mention in someone else's blog, specifically the excellent Margin Notes (a University Affairs blog) written by Léo Charbonneau (@Margin_Notes on Twitter). Thanks for that, Léo! And now in the cheery spirit of "tag! You're it", I thought I'd share some of my own favourite higher education news resources; this includes blogs, Twitterers, and websites, all parts of the odds-and-ends collection of sources from which I draw my daily gulp of PSE news and commentary.

Jo Van Every's blog is a great resource; Jo is an "academic career coach", and her blog offers great career advice whether you're a grad student or a mid-career academic. She's also a great conversationalist on Twitter (@jovanevery).

I also recommend College Ready Writing by Lee Skallerup Bessette, another prolific Tweeter (@readywriting) who somehow finds the time to teach full-time at the post-secondary level while writing many excellent blog posts about pedagogy, writing, and academic career choices.

Higher Ed Watch--never heard of a "sub-prime student loan"? Check out this excellent (U.S.) policy blog, where there is regular critical and detailed commentary about for-profit colleges, student loans, and other aspects of post-secondary governance and political economy.

Inside Higher Ed, another U.S. site with an impressive round-up of PSE news every day. The site also includes a series of blogs dedicated to commentary on specific topics in higher education.

Hook and Eye and University of Venus (here at Inside Higher Ed)--both blogs are written by and about women in the academy, and both offer a range of thoughtful contributions from regular editors and guest bloggers (@fishhookopeneye, @UVenus).

Two feeds I have plugged in to on Google Reader are the "news" and "media scan" feeds from University Affairs Magazine (@UA_Magazine)--an efficient way to keep up with the latest in Canadian PSE news.

I'd like also to make a point of dropping the names of a few fellow Tweeters who've participated in some pretty interesting PSE-themed conversations over the past wee while: this includes Mary Churchill (@mary_churchill), one of the founders of University of Venus; Mary-Helen Ward (@witty_knitter); and Janni Aragon (@janniaragon), as well as Jo Van Every and Lee Skallerup Bessette (see above).

And last but obviously not least, for an interesting blog-in-the-life of a Canadian university professor, check out my friend Alex Sevigny's blog (he's also on Twitter at @alexsevigny). Alex writes about his experiences as a prof and a professional communicator.

Thursday, November 11, 2010

A Source of Revenue?

I finally got around to reading Daniel Wolfe's article about internationalisation, but this is what confuses me, and always has, regarding the recruitment of international students to Canada...

"I think the reasons for internationalization are many, and bringing in extra tuition revenue is one of them, undeniably (international undergraduates, other than exchange students, pay increased tuition to reflect the fact that there is no government operating funding provided for international enrolment)."

If the government isn't contributing any funding towards supporting these students, then surely the extra tuition we charge them doesn't really count as "extra" because it's money that the government would have provided (had they been domestic students)? So in effect we charge them more because they cost more? Unless they are really scalping these students and adding huge amounts to their tuition (i.e. a lot more than what the government would provide), then this doesn't sound like a revenue stream to me. But perhaps that is what they're doing--I'm not sure (please answer in the comments if you know!).
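To make the arithmetic explicit, here's a quick sketch in Python--every figure below is hypothetical, chosen only to show the shape of the calculation:

    # Hypothetical figures only: international tuition is net "extra" revenue
    # only to the degree that it exceeds what a domestic student brings in
    # (tuition plus the per-student government operating grant).
    domestic_tuition = 6_000        # hypothetical annual domestic tuition
    operating_grant = 9_000         # hypothetical per-student grant
    international_tuition = 16_000  # hypothetical annual international tuition

    revenue_per_domestic = domestic_tuition + operating_grant
    net_extra = international_tuition - revenue_per_domestic

    print(f"Per domestic student:      ${revenue_per_domestic:,}")   # $15,000
    print(f"Per international student: ${international_tuition:,}")  # $16,000
    print(f"Net 'extra' revenue:       ${net_extra:,}")              # $1,000

On numbers like these, the "extra" nearly vanishes--and that's before the additional costs I mention below.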

International students often require special resources above and beyond what domestic students usually need, so there are other costs that detract from this "profit" as well.

And if Canadian university expansion occurs to accommodate more international students, what will happen if and when those students' "source countries" develop the capacity to educate locally? Unless of course we just want to keep exporting our Western Brand™ of university education and never expect those countries to develop their own "knowledge infrastructure". I just get the sense that we're trying to use these students to shore up gaps in future cohorts that we know are going to decline, because of demographic trends. We are still treating these students as a "market" and markets fluctuate; are we actually planning for that or are we just heading into further financial sketchiness?

Wednesday, November 10, 2010

Intractable Problems

In the UK today, students and faculty are protesting extreme cuts to funding for university teaching, with targeted exemptions (STEM subjects). Nick Clegg of the Liberal Democrats has predictably been unable to prevent the Conservative-led government's wholehearted adoption of the recommendations in the recent Browne Report.

You've seen it before--in Canada, every year, the CFS holds rallies around the country protesting tuition fees; in California, students staged protests and sit-ins in response to massive funding cuts; now in the UK, students and faculty are rallying in reaction to the policy bomb dropped by Cameron's government.

But the cuts happen anyway--in California, around the U.S., in England and soon enough, Canada as well (this depends on the situation with transfer payments; technically university operating budgets are a matter of provincial jurisdiction, and tuition is set at the institutional level).

At this stage, I'm not interested in launching into a diatribe about the uselessness of activism--because I don't really buy that argument. Activism of the kind I'm describing has had positive results in the past.

What I want to know is how we can engage politically in ways that prevent these kinds of "solutions"--massive cuts, for example, and related retooling of university governance--from being either required or imposed. Because the situation we're in now with university funding is one that's evolved over a period of about 40 years or more. Surely during that time, students and faculty could have been aware of changes happening? Or did we simply not realise until it was too late?

And then there's the economy--the rise and fall, boom and bust, starting with severe recessions in the 1970s and continuing through to the most recent "downturn" beginning in 2008. Why are universities (indeed, governments and banks) incapable of weathering these economic storms? Why is it that each time the axe falls, education--in spite of its apparent relation to economic prosperity--seems to be one of the first areas up for the chop?

Even after spending a fair bit of time with these problems over the last five years or so, I feel bound up as if by a web when confronted by these kinds of policy quandaries, which are central to the governance of universities and which have such real effects on people's everyday lives. Today's protests in the UK remind me of just how far we have to go yet before the answers are in sight.

Tuesday, November 2, 2010

Creative Thinking

Lately, I've been thinking more about the nature of "creativity" or what it means to "be creative"--probably because there's been an increasing amount of conversation about education and creativity, relating these things to the development of solutions to pressing social, economic and ethical problems.

One of the reasons I find it hard to imagine "teaching creativity" is that I've never not been "creative" myself. I've always been one of those people who was labelled as such fairly early in life, and in some ways that's made it harder for me to form an impression of creativity beyond the ways in which people tend to apply the term to me. I think the labelling also highlights the way that some talents (such as my ability to draw and paint) are associated with creativity, while others (a gift for numbers) might not be.

Another reason I find it hard to think about teaching creativity is that I still haven't seen a convincing working definition of the term. My own definition, as far as I can think of one, would involve primarily three things:

Critical questions: It's hard to be creative if you just accept what is already "there", without thinking. Being critical is not just about identifying problems (for example), it's also a process of questioning the assumptions underlying the problems and assessing the worth of various potential solutions.

Imagination: Criticism turns to nihilism or stagnation when one cannot "imagine" a solution. We need to be able to see the possibility of another way of doing things, beyond what's immediately evident.

Knowledge and understanding: You cannot do something new and inventive and helpful, or imagine a possibility and bring it to fruition, or make reasonable judgments, when you don't have a good knowledge base and an understanding of the tools available. This is the case whether you're a ceramicist trying to determine the appropriate kiln temperature for a glaze firing or a policy-maker analysing the various options available for financing social services.

It matters how these terms are used, how words like "creativity" are defined, because of the salience of the concept in current political and economic discourse--in particular its perceived relevance to the much-theorised "knowledge economy". What kind of policy proposals will be put forth in an effort to increase "creativity"? On what assumptions will these suggestions be based?

Much of the time, "creativity" is being slotted into a kind of ideal trajectory of (economic) development, one that involves innovation, entrepreneurialism, economic efficiency and productivity, and national competitiveness (a good example of this is the analysis from Richard Florida, who has popularised the term "creative class" and whose work focusses on the economic benefits of creative work).

This means that there's likely to be a preferred definition of creativity, one that fits with the trajectory--an ideal "creativity" that produces economic competitiveness as its ultimate outcome. In this case, which comes first: the policy, or the definition of "creativity"?

All this is important for education policy because creativity is often linked to the public discussion about the "failure" of schools. Education, which has so often been treated as social engineering, is imagined as the best way to retool the workforce (human capital) for an "innovative" economy.

A useful example of this approach is that of Sir Ken Robinson, a prominent lecturer and consultant whose well-known talk for TED is a celebration of the inherent creativity of small children and an analysis of how the school system destroys said innate creativity.

In another video, Robinson argues that creativity can be assessed. How? By assuming a particular definition. Creativity is "not an abstraction--to be creative you have to be doing something." So Robinson defines creativity as "a practical process of making something", the "process of having original ideas that have value." Originality points to the emphasis on newness and innovation, while value assumes the possibility of assessment; creativity can be assessed through determining the field and employing clear criteria that are relevant to that field. Robinson also stresses that assessment is both a description and a comparison of creative work.

I wrote out my own definition before listening to Robinson's talk. I think it's interesting that while he describes creativity as a "process", he seems to be concerned primarily with the outcome of the process ("ideas that have value"). He also doesn't delve into the ways in which different kinds of knowledge are valued differently, and how even within fields, ideas do not exist within a kind of meritocratic marketplace. Comparison and assessment are fundamental to the market as a mechanism of governance, so one could argue that Robinson's emphasis reflects an economic basis for the concern with what children "produce" at school. It also feeds into a decades-old discourse of criticism of public school systems, one that has been notoriously unhelpful in producing better schools.

In coming up with a definition for "creativity", I think we need to ask within what system of valuation "creativity" exists--and the ways that system affects how creativity is thought about and defined. What kinds of "creativity" are seen as appropriate, productive? And what does it mean for education when a constant public discourse of critique takes up such nebulous, catchy/catchall terms, which are in turn mobilised and reified in specific forms through policy debates (such as those occurring currently in the United States)?

Sunday, October 24, 2010

Technology and Research, Part 2: Tweeting and Blogging

Continuing my little discussion of the ways in which I've most recently been using online technologies in my daily research and writing habits, today I'm moving on to the complementary combo of Twitter and Blogger.

Since one of my goals over the past six to eight months has been to interact more with people who share my research/academic interests (outside of my graduate program), I've been doing more social media exploration than usual. A relatively recent major change to my online habits has been my increasing use of Twitter as a way of connecting with strangers and keeping up with news.

I operate with a kind of minimalism when it comes to technological tools--as I mentioned in a previous post, I tend to want only the tools I need, and only the tools that work. It's for that reason that I (and others) didn't start using Twitter until quite a while after I first looked at the site and logged on to create an account. I simply couldn't see any point; like so many people, at first I thought of Twitter as a useless stream of trivial chatter that would only further clutter my already-limited field of attention.

In spite of my own skepticism, at some point earlier this year I decided to try "tweeting" a bit more in earnest. Since that time I've decided that there are "two Twitters": the banal barrage of idiotic celebrity gossip and predictably dreary/melodramatic personal updates, yes, that Twitter does exist (of course!). But the flip side of it is a fascinating and wide-reaching series of exchanges, often with people I'd never have encountered otherwise; it's a stream of useful news and links that I couldn't possibly have rounded up on my own; and it's a means of responding to those things, and sharing my own, in such a way that the conversation continues and expands.

But it does take time to learn how to use Twitter effectively as a tool--assuming you know what you want to accomplish with it. At first, without a list of "followers" and with no sense of who else was using this tool and what they might be doing, I felt as if I was sending messages into the aether with little idea of "audience", tone, or purpose. Fortunately I had a few friends already tweeting busily, who helped set an example for me in terms of Twitterquette.

Among the more important things I learned was that while it's more or less true that the more accounts you add to your own list, the more "followers" you're likely to gain, the best way to get the most out of Twitter is by participating actively. For example, a means of navigating Twitter is through using "hashtags", or words/terms attached to a tweet with a # sign: e.g. #CdnPSE for "Canadian post-secondary education". You can "meet" other users by following tags, and interact with them by "replying" to their tweets or by "re-tweeting" them (passing their content around). A system of crediting others is integral to all this; another aspect is that of suggesting users to other users (often with the tag #FollowFriday or #FF). I found that one of the biggest challenges here was feeling confident enough to interact with strangers, but once I was over that hurdle things became much more rewarding.
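For the technically inclined: at the time of writing, Twitter also exposes this hashtag navigation through its public search API, no login required. Here's a minimal Python sketch--I'm going from the JSON endpoint and response shape in the current search documentation, so treat the details as provisional:

    # Minimal sketch: fetch recent tweets for a hashtag via Twitter's public
    # search API (endpoint and JSON field names as documented at the time
    # of writing; no account needed for searches).
    import json
    import urllib.parse
    import urllib.request

    query = urllib.parse.quote("#CdnPSE")  # the Canadian PSE tag mentioned above
    url = f"http://search.twitter.com/search.json?q={query}&rpp=10"

    with urllib.request.urlopen(url) as response:
        data = json.load(response)

    for tweet in data.get("results", []):
        print(f"@{tweet['from_user']}: {tweet['text']}")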

To sum up: I like using Twitter because it affords a form of participation in an ongoing conversation, but it's one that isn't limited to--for example--my Facebook contacts, who are an entirely different group. While on Facebook I keep things generally quite private, on Twitter I'm happy to see strangers adding me to their lists--unless they're bots or marketers. (Now the only thing I can't find, or haven't found yet, is the perfect Twitter client. But that's a whole different blog post...)

Tweeting got just a little bit easier a couple of months ago when del.icio.us (as mentioned in the previous post) also linked to the site, so now you can bookmark, tag, and send a link to Twitter--with a comment--all in the same pop-up window within your web browser (for Firefox, anyway). The other way I access the daily news is through Google Reader, so now I have a Reader-->del.icio.us-->Twitter process that works pretty well for finding and reading relevant news, saving articles for later, and sharing them with people who are likely to want to read them.

And lastly, there's the blog. Even as an ex-zinester I've never felt comfortable writing blogs; the required regularity felt somehow journal-like, and I'm terrible at keeping journals. So I began, in fact, with a photo-blog that was at first a daily affair but eventually became weekly as the posts grew longer and often incorporated multiple pictures. A year later, after I'd managed to maintain Panoptikal and even pick up a few "viewers", I decided to incorporate my academic interests and my new Twitter habit by starting an education-oriented blog (the one you're currently reading), with the goal of practicing writing outside a formal academic context.

I've found that the blog is a great place to say something shorter and less formal than I would in an academic paper or presentation. It's a place to brainstorm without pressure, a venue for painting a small picture of my own views and for developing them further, and conversing with others about the issues raised. It's also something expressly public, so it's accessible for those who can't view journal articles or even private web sites where such conversations might happen in a more regulated environment (for example, Facebook). For anyone considering becoming an academic, the public nature of blogs can be a means of reaching a broader audience, of "engaging" multiple publics in the conversation about your research--and seeing immediate commentary. To keep building on that conversation, I embedded my Twitter feed and a list of links from del.icio.us into the blog's format.

At this stage you may be thinking--this sounds like a lot of effort; what's the point of all this reading and commenting and tweeting? The interesting thing is that I wasn't sure myself, for quite some time, why I was "doing all this". But I got more of an idea this past Friday when I got to sit in on a workshop run by Alex Sévigny, a friend who also happens to be a successful professor, a professional communicator, and a prolific blogger and social media buff.

The overall event, organised by Hamilton's Cossart Exchange, was ostensibly for graduate students who are interested in developing non-academic careers. But I think Alex's message was valid beyond its immediate context. His point was that for those people operating outside of existing/rigid employment structures, the process of "self-branding" (as unpleasant as it may sound) has become an integral part of professional success. Before social media, this was more difficult; but now that so many of us have access to social media tools, the opportunities have expanded dramatically. Development of an online "identity" or "face" helps you to make yourself known to potential employers and collaborators, and helps you connect better with those you've already met.

So it turns out that maybe there has been a use for all my blogging and tweeting, one beyond the immediate gratification of chatting with strangers about the things that interest me most. And here's the lesson for grad students: since so many of us are spending too much time online anyway, we should really learn how to channel those efforts and make them count towards career-building (!).

Wednesday, October 13, 2010

The Down-side of Technology? On Class Time.

I want to raise a topic that of course has no easy answers, but which has been coming up quite a bit recently in my job as a teaching assistant for a lecture class of about 100 students. I know many others have discussed this too, so I'm just adding another thread to the long conversation.

Last week in class--in the lecture right before the tutorial I teach--I sat in the back row, as is now my habit, and a fellow TA sat next to me. In the second half of this particular class a film was shown. During the film, some students chatted, others used their computers to look at Facebook or other popular sites and/or to chat online with friends (this they do every class), and hardly any of them took notes even though the film's content will be on the exam. From where we were seated, we could also see many students thoroughly tuned in to their mobile devices (Blackberrys, iPhones etc.).

The main reason that we were paying attention to this is that the instructor had asked the students not to use Facebook during lecture. Her reasoning, simplified, is that while it's more or less each student's personal choice whether or not to engage with the class (student responsibility), other students might be distracted by your Facebooking activity--so it is about respect for one's classmates, as well.

However, this logic has failed; in our class, it's not unusual to see students wearing their ear buds during lecture and watching videos on their laptops.

After last week's class we (the course director and TAs) had a discussion over email about how to handle the students' use of these technologies in the classroom. The question is both a pedagogical and a pragmatic one: what model of learning underlies our reaction to the students' "off-task behaviour", and what should that reaction be? What is the next step forward from the argument about "respect" (such a painful position to abandon)?

To me this is not really an issue about the technology per se. After all, when students had only a pen and paper they could still indulge in the habits of doodling or daydreaming or writing and passing notes (as pointed out by this author). In our class, private conversations happen during lecture and there is laughter at inappropriate moments, showing that students either aren't listening or don't care about what's being said. It's not that new technologies create rudeness or boredom; they just hugely expand the range of distractions in which students can engage, and they do it in a way that's difficult to censure explicitly (you can't take away a student's mobile phone).

Not only is technology not the only "culprit"--it's also not the case that all students who use Facebook or surf the web are "tuned out" of class; they may be looking up something related to the course, for example, or otherwise using technology to add to their learning experience. Pedagogically, there are many ways for instructors to make use of technology in the classroom--but I think it can only happen when students are already interested and motivated, and keen to interact in class.

A well-known example is that of a professor in the United States who collaborated with a class to create this video, one in which certain relevant points about technology and education are conveniently highlighted--even as students are engaging actively in the solution to their own problems (more info and discussion here). The video "went viral" on YouTube--providing a great demonstration of students and faculty engaging with the world "beyond" the university and doing it through making their own media content.

How can we create this kind of engagement, which has to come from students, not just from professors? How do we convey the "rules of the game", which require student participation, without being forceful, pedantic or dictatorial, without fostering resentment? It seems strange to ask students to participate in their own education.

I'm still a student myself--and I know I need to bring something to the educational equation (interest, energy, effort, attention, a desire to learn, a degree of self-discipline) or the result will be negative. There must be a balance of responsibility between what the professor or teacher does--what the university provides--and what students need to do for themselves. Consumerist attitudes towards education (encouraged by high tuition fees) and the imperative to "edutainment" are skewing this balance as a marketised, customer-service model becomes more the norm at universities; yet so often in the past the balance has slumped too far towards the weighty dictates of the institution alone.

As someone teaching--even as a lowly tutorial leader--my observation is that the practice of "dealing with" changing student attitudes often happens through a kind of informed yet haphazard, everyday decision-making, through experiential negotiation of the common ground shared by ethics and praxis, driven by a need to act in the immediate present and to be proficient at teaching in a classroom. The loss of students' attention feels like failure of a kind, but what does one have to do in order to "succeed"?

And so to return to the immediate problem, what should my colleagues and I do about our "classroom management" troubles? Should technology such as laptops or wireless Internet access be banned outright from the classroom? Such tactics feel paternalistic. Are there other ways of working with students to create a better environment for interaction and learning, such as making rules and setting parameters? What about when students don't want to work--how do we walk the peculiar line between exercising "authority" and asking people to exercise authority over themselves?

Thursday, October 7, 2010

Technology and Research, Part 1: My Obsession.

Perhaps it's my background in visual art that makes me more prone to this, but for much of my life I've been suffering from pack-rat-itis. For example, I still maintain (though adding less to it now) my large collection of clipped images and texts from magazines and other paper publications. I keep a stash of various art supplies and a stocked "toolbox" with everything from string to copper wire to paintbrushes and tape measures. I've acquired a collection of notebooks and sketchbooks over the years and I keep these as well, as records and notes about ideas and projects both finished and unfinished.

And yet there's a sort of competing tendency that keeps things in check: I'm also one of those people who loves the storage and organization section of IKEA, because I like the thought of keeping practical items handy in such a way that I can easily reach them and use them. I hate having mounds of stuff and no way to do anything with it; I dislike even receiving gifts if they have no useful purpose and simply require "storage" (sitting on a shelf). I don't even see the point of having two of the same kind of screwdriver. Periodically I "purge" my supplies (usually when I move house) to make sure I'm not holding on to anything completely useless. My need for workable space may occasionally collide with the squirrelly tendency, but usually the one cancels out the other.

These habits have been transferred, now, to the work I do researching for my dissertation and other projects. Not only do I stash books and papers; my computer "desktop" itself has become a version of the way I'd probably organise my apartment if it were possible--everything is kept filed away, labelled clearly and in embedded folders, but everything is kept. And I'm finally at the stage where this habit is starting to pay off: I have a searchable library of notes and PDF files to which I can refer while working on the next phase of my dissertation. It looks slightly over-done to the casual observer, but then what is academic work if not retentive?

The latest manifestation of all this, and one that has become like a third arm to me when it comes to online research, is the social bookmarking tool del.icio.us. This little slice of magic won me over when I realised that all my current, browser-based bookmarks--which couldn't be accessed from multiple computers--could be a) uploaded with minimal effort and b) tagged (categorised and labelled with key words), by me, in such a way that they would become useful.

Not only is del.icio.us a powerful tool for sharing things with others and seeing what others are reading; it is--more important to me--a means of creating a personal database of web-based content, accessible from any computer I happen to be using. Why is this desirable? Because I view the web as a major part of my research process, not only in terms of finding the materials I need (books, journal articles, etc.) and connecting with new people (including academics, writers, politicians and policy-makers) but also as a one-stop supersource for media content and information/commentary on current events--crucial to my interest in universities, post-secondary education, politics and policy, and the ways in which ideas about these things circulate discursively.

del.icio.us also has some pretty desirable features that make it easy to incorporate into my daily news-reading habits. As I mentioned above, existing browser-based bookmarks can be imported, saving a lot of duplicated effort (I was able to use about 4 years' worth of saved links). There is also an extension integrating del.icio.us into your (Firefox) browser, so that clicking on a single button allows you to tag and comment on something before saving it to your account; the same extension allows you to search existing tags in a side-bar. The list of PSE links at the left-hand side of this blog page is channelled to Blogger from del.icio.us as well, showing only those recent links tagged as relating to PSE. As you can tell, the tagging system is key to the usefulness of del.icio.us, and I soon developed my own strategy for maximising the usefulness of tagging.
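(For the curious: the same tagging machinery is exposed programmatically through the del.icio.us v1 API, so bookmarks can also be saved by script. A rough Python sketch--the credentials and the link below are placeholders, and I'm going from the API documentation as it stands at the time of writing:)

    # Rough sketch: save a tagged bookmark via the del.icio.us v1 API
    # (HTTP Basic auth; endpoint and parameters per the documentation at
    # the time of writing). The credentials and link are placeholders.
    import base64
    import urllib.parse
    import urllib.request

    USER, PASSWORD = "your-username", "your-password"

    params = urllib.parse.urlencode({
        "url": "http://example.com/some-pse-article",  # hypothetical link
        "description": "Article on tuition policy",    # bookmark title
        "tags": "PSE tuition policy",                  # space-separated tags
    })
    request = urllib.request.Request(
        f"https://api.del.icio.us/v1/posts/add?{params}")
    token = base64.b64encode(f"{USER}:{PASSWORD}".encode()).decode()
    request.add_header("Authorization", f"Basic {token}")

    with urllib.request.urlopen(request) as response:
        print(response.read().decode())  # expect <result code="done" /> on success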

And while all this seems like a lot of work, it really isn't--compared to the ways in which it's paying off. During the York University strike of 2008-2009, I tagged/bookmarked over 300 news items--press releases, articles and blog posts--which I was able to use later for a media analysis that became a conference presentation. I've saved clusters of articles on a series of specific themes that will work as media case studies in the future (possibly for publications); one of these I've already used in a class lecture on Critical Discourse Analysis. And then there's the usefulness of simply being able to access "that article" you read two months ago--the one about gender and accessibility and women's pay (for example)--and bring it in to class or into a paper or blog post or--you name it. I see this not only as a way of keeping up to date with current developments in the "field", but also as a means of enriching what I'm writing by referencing a more diverse array of sources.

del.icio.us is one of those Web 2.0 tools that makes me feel blessed to be researching in the Internet Era. And, I admit, it's also just a teeny bit enjoyable to be able to justify my storage and organization "habit" (hobby? Obsession?) as a means of actually advancing/enhancing my own research work.

---------------------

Coming up soon, in Part 2: Why I like "Tweeting" and "Googling"...a few comments on the Internet, connectivity and interdisciplinarity.

Monday, October 4, 2010

Writing it Out

At the risk of drifting into the Dull Squalid Waters of Graduate Student Angst, today I'm going to talk about writer's block--possibly as a means of getting around it. Now that's creative! ;-)

In my case, getting stuck on process is something that often comes from insecurity, a fear of "acting" and "just getting things done"; so I've tried to work at my own writing strategies over the years. But this kind of detailed thinking-through and development of self-knowledge isn't necessarily something we see being explored in graduate school (for various reasons--see my previous posts about related issues), possibly because writing help and development are often assumed to happen during the student's coursework (unless there are no courses) or at the university writing centre. It may even be assumed that students should have learned how to write during their undergraduate studies, or that they "had to know how to write" to get in to grad school. Yet I've had numerous professors tell me that writing skills are a major problem even at the graduate level (where a whole new level of writing is required).

I was recently helping a friend, who is an M.Ed student and a good writer, to prepare a grant application--and I noticed that his draft had been re-written by one of his profs (rather than merely edited). I could tell from the language she'd used, compared to previous drafts he'd written; and because the language had changed, so had the project--into something he hadn't really "framed" himself.

As we went over this new, re-written draft, I helped him to replace language that seemed inappropriate by asking about the ideas behind, and impressions conveyed by, the words; we also "broke up" the seemingly polished structure of the writing by cutting, pasting, rearranging, and adding in points with no concern for cosmetic editing. We pulled out the issues that seemed to be central and made a list, starting over with a new structure and concentrating on telling a coherent "story" about the project.

It felt as if the real focus kept getting lost in all the ideas that were floating around--that was half the problem. But the real trouble for my friend was even more basic--he had been told to write something in a completely new genre, and offered almost no guidance. With many thousands of dollars' worth of grant money at stake (the Ontario Graduate Scholarship is worth $15,000 for a year, and Tri-Council grants offer more), writing had suddenly taken on a new and immediate importance, and there was little appropriate help to be found from professors swamped by similarly panicked grad students (a good number of whom have never heard of a "research grant" before their first year of PhD).

In the end it wasn't due to my teaching skills that we ended up making progress (if we did)--far from it: I'd never done this kind of work in my life, and I had to think: how does one write? How do I write? After all, I was pretty much the only model I had to go on. I had never really thought about that uncomfortable process outside of trying to enact it somehow, as contradictory as that sounds. My friends don't usually discuss how they write, though they frequently bemoan the difficulty of it. I'd helped students with writing before, but there had never been time or space for such in-depth consideration. So the struggle for me was one of translation and negotiation; fortunately, what I did have was some experience with producing grant proposals.

This only made me think more about my own current editing tasks--my dissertation writing and the papers I'd like to see published, in particular. I was recently forced to consider how much my process must have changed over time, when I was revising a paper written during one of my MA courses. The paper lacked the structure I would have given it had I written it more recently--indeed, I'm currently re-ordering the entire thing so that the reader isn't expected to plough through the textual equivalent of an army obstacle course. My more recent writing is evidently better planned, as the other papers showed, but work from just 18 months ago still seems littered with tentative statements and unnecessary words, begging for a linguistic pruning.

And yet I can't remember ever having been told anything about these things--ever really learning them--other than perhaps by osmosis. This gives me some faith in the concept of a kind of gradual improvement with time and practice; but I still think it's the self-reflexive process of working with other people that brings real perspective and the motivation to actually consider one's habits and tendencies in more depth, with an eye to doing better (writing) work, and to working better overall.

Tuesday, September 21, 2010

The Proof of the Pudding

Throughout the first few weeks of September, we've seen a number of reports released, both in the U.S. and Canada, discussing and describing (quantitatively) the positive outcomes that students generate from obtaining university credentials. These reports have appeared at roughly the same time as the international university "rankings", which were unleashed around the middle of the month--along with OECD education indicators and Statistics Canada reports on tuition fees and national education.

The strategy here seems straightforward enough; after all, at the beginning of the school year, it's not primarily students but rather their parents--in many cases--who are concerned about whether the college or university experience is going to be "worth the investment". (I would argue that the parents should also look to their own departing children if they want to know the answer to that question!) It's a great time to capture an audience for the debate, since students beginning their last year of high school at this time (most of them still living at home) will also be searching for relevant information about possible PSE options.

These articles and reports stir up the debate about public vs. private funding of PSE, about the rising proportion of university revenue generated by tuition from students and families, and about the cost to the state of educational expansion. They also pitch university education primarily in terms of its economic value--not only to individuals, but also to the state (since educated people are "human capital"). Education correlates with increased income over one's lifetime, with better health (saving taxpayer dollars), and with inter-generational class mobility. These arguments, along with those citing tough times for the government purse, are frequently used to support a pro-tuition-increase position both in the media and in policy debates.

All these points may seem valid enough until we consider the fact that while students may all technically pay the same amount in tuition (say, at a given university or in a particular program), they don't all receive the same "product". And universities generally advertise to them as if the same product is really on offer to everyone. Which it certainly isn't--the costs alone (which exceed tuition) are borne in entirely different ways by different students, a point briefly raised by Charles Miller as quoted in this article. If my parents pay for my tuition and living expenses, then what costs am I absorbing over the period of a 4-year undergraduate degree? How does this compare to a situation without parental support? Low-income students are less likely to have family help and more likely to take on a large debt burden; they are less likely to have savings accounts and personal investments, less likely to be able to purchase cars and condos when their student days are done.
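(Here's that comparison as a toy sketch in Python--all the numbers are invented, but they show how the "same" tuition produces very different balance sheets:)

    # Invented numbers: two students pay identical tuition for a 4-year
    # degree, but family support changes who actually absorbs the cost.
    years = 4
    tuition = 5_000        # hypothetical annual tuition
    living_costs = 10_000  # hypothetical annual rent, food, books, transit

    total_cost = years * (tuition + living_costs)  # $60,000 over the degree

    parental_support = {"Student A": total_cost, "Student B": 0}
    for student, support in parental_support.items():
        debt = total_cost - support
        print(f"{student} graduates owing ${debt:,}")
    # Student A graduates owing $0; Student B owes $60,000 -- before any
    # interest on student loans is counted.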

Aside from the variation in economic circumstance, students also bring differences in academic ability and social and cultural capital to their degrees, which means that development differs for each person and so does their overall capacity for career-building.

Not only does university have different "costs" for different people; it also has highly variable outcomes. Some students will land solid jobs and find themselves upwardly mobile after completing a bachelor's degree. Others may continue to a Master's or even a PhD and discover that gainful employment is impossible to find, for a variety of reasons. There's also the question of whether students obtain jobs in their chosen fields--or within a particular income range, for that matter. And once they do find employment, earnings differences by gender (for example) persist: women in Canada still earn significantly less than men for equivalent work.

Another form of quantitative justification, the rankings game, is an attempt to make the intangible--the "quality" of education, or of the institution--into a measurable, manipulable object. Part of the yearly ritual is the predictable squabble over methodology, which generates much commentary and debate, particularly from those institutions that have found themselves dropping in the international league tables. This quibbling seems ironic given that all the rankings are embedded in the same general global system of numeric calculation, one that feeds international competition and now constitutes an entire industry that rides on the backs of already overburdened and under-funded university systems. While the public may rail against the supposed over-compensation of tenured professors (salaries represent universities' biggest cost), institutions continue to engage in the international numbers game, pumping money into the yearly production of "free" data that are then made inaccessible by the ranking organizations (who profit from their use).

Education reports, with their quantitative indicators of the economic "benefits" of higher education, are part of the same overall tendency to assess, to compare, to normalize and standardize. Earnings-related numbers often provide rhetorical support for policy agendas that involve higher tuition fees, since proving the "private" benefits of education means that we can charge the user or "consumer" of education for access to these (eventual) benefits.

Rankings and statistics serve as a means of informing risk assessment--for governments, when funding is increasingly based on "performance", and for students, when it's about choosing the "better" university. But no numbers can truly gauge or alter the inherent risk of education and knowledge, the ineffability of the paths we take to discovery, the serendipities of fortune and temperament that can lead one person to the gutter while another may hit the heights of achievement. Students have moments of inspiration, they meet undistinguished professors who never publish but turn lives around. They form unexpected friendships and stumble on opportunities, skewer themselves on pitfalls both obvious and unseen.

In other words we cannot ac/count for this most joyful and painful side of our educative experience--the unknown element which is frequently the most formative one; and the more we attempt to inject certainty into this process, the more we set ourselves up for disappointment. This doesn't mean there's no use for numbers, for evaluations and assessments, for attempts to improve our universities. But sensible decision-making, whether by students or by governments, will always involve more than a measurement.

Monday, September 13, 2010

Interesting Critique...

...of the latest clutch of Higher Ed books to drop into the market. The "Failed University" is becoming the topic-du-jour for those looking for a fresh target on which to pin national and economic failure.

An interesting point here is the continuing segregation of colleges and universities into the "haves" and the "have-nots", with the student populations at these schools reflecting this divide in terms of their socio-economic status (for example). Underprivileged students are coming into an "accessible" system where they find that not all "access" is equal, particularly at for-profit institutions that charge high tuition and enable students to rack up many thousands of dollars in debt (as we have seen recently in the U.S.).

This is an important point--and I agree with it. However, I think another important thing to remember is that "higher education", particularly the university, was an elite institution--more or less--for its entire history up until about 50 to 60 years ago. And structurally this is still the case today.

In Canada, this meant small institutions with religious affiliations, where funding came from student tuition and private donations. Not until the post-WWII period, with the Veterans' Rehabilitation Act, did Canadians see an accessibility initiative anything like what we have in place today; and in the period from the end of the 50s to the beginning of the 70s, enrolments tripled.

Have we really created planned/considered structural solutions that reflect these significant changes to enrolment, and the drive to "accessibility"? Or have we merely tried to extend the old, elite model to more people--negating its past function, without acknowledgment that we've done so? What will we substitute for this model--and why, after 50 years of increasing massification and its deep consequences, are we still asking?

We know there has been a change in signification--a Bachelor's degree simply doesn't "mean" what it used to in the past. Some people even talk about a graduate market glut (another jobs/skills mismatch?), even as others build arguments about why a university degree is "still worth it". But as I argued in my previous posts about tenure, I'm not sure we're really acknowledging the extent of the changes that have occurred--or indeed the ways in which higher ed is repeating mistakes made in the past with the primary and secondary education systems. "Worth it for whom?" is only the first and most obvious question.

The critique of socio-economic class reproduction--one so frequently levelled at national, public school systems--is now being targeted at universities and colleges. I think we need to ask: why are the problems persisting and the same critiques being offered? Why have we not solved this problem in primary and secondary education? If we haven't "fixed" the first 12 years of education--can we expect to manage the postsecondary problem successfully? And is it really the best approach simply to attack the existing system--as we are seeing now with universities?

Monday, September 6, 2010

Decisions, decisions. Part 2: Tenure and what else?

As I discussed in my last post, the "vanishing tenure" problem is partly a simple matter of numbers, but it is also something more. There are now (not coincidentally) many, many more graduate students than there ever were in the past--both in terms of gross enrolments and also by proportion. In Ontario this is by design, as is evident from recent government policy. But does the government intend to expand graduate programs in order to create more tenured professors? No. Its primary goal is to develop self-sustaining "human capital" and to boost the provincial (and ultimately, national) capacity for constructing a competitive "knowledge economy".

So according to that logic, most of us should be looking to build careers in other, "knowledge-intensive" fields. But how many of us currently in grad school (especially on the PhD track) know what those fields are, and how to access them? Can professors (our supervisors) help or not? How can we find appropriate mentorship for this kind of transition? What is this alternate path we're expected to take, and where does it lead? Was this what we were encouraged to expect when we applied to graduate school?

Here we hit upon a cultural snag that is not being addressed by government policy: in many PhD programs, there is a perpetual assumption (or implication) that non-academic jobs are inherently less desirable and somehow not "pure" or good, since the academic system is designed to replicate itself--graduate education has historically been a process of "socialisation" into the professoriate. This ethic is still being inculcated in graduate school, and it's one that goes directly against the exhortations of government policymakers and professional pundits alike. This is why there are so many articles and blog posts dedicated to the subject of "escaping" academe, and why graduate school has been characterised as a "ponzi scheme" and even a cult.

As I mentioned in my last post, this socialisation/enculturation model worked well in the past, when very few students went on to complete PhDs and then filled the professorial positions available. But it is directly at odds with the form of systemic expansion we're now experiencing. In another previous post I discussed a breakdown of graduate mentorship; now not only are mentors becoming scarce, they may not possess the knowledge, social capital, or indeed even the motivation to help graduate students find non-academic work. What's worse is that after years of graduate study, many students remain in denial even when faced with the reality of the academic job market.

For current graduate students, I think the important question to ask in the face of all this is not "why did you really go to graduate school?" but, more fundamentally, "will you make a decision about why you're there?"--rather than continuing to assume that your PhD will (and should) lead to a job as a tenured professor. In suggesting these kinds of questions, I don't mean to imply that we should take an entirely instrumental view of graduate education or discount the joy of serendipity. But we do need to learn to think twice before counting on that desirable academic position waiting somewhere down the line (or thinking that once we obtain such a position everything will be fine).

And this isn't a negative thing. We do have options: the choice is not between "tenure-track professordom" and "failure". It is not between an endless cycle of job applications and contract positions while waiting for that elusive permanent academic position to appear--and "giving up"; it is not a choice between intellectual martyrdom and "selling out". And while the question of "alternative" careers is addressed to varying degrees, and in different ways, across disciplines and programs, there is still a strong culture of replication in PhD education, one that is bolstered by increased competition for scarce resources.

As graduate students or prospective grad students we need to think about why we're being encouraged to go to graduate school and what will become of our lives because of it. I don't believe that we should accept the sacrifice of balanced and healthy lives in order to realise the Academic Dream. Nor should we feel that achieving this Dream is the only form of sanctioned success.

Among those who have made the decision to follow the academic trajectory, there will have to be more consideration and awareness (in all disciplines) of the fact that while the traditional tenure arrangement worked in the past, the current system--stressed by undergraduate and now graduate expansion, limping by with proportionally less government funding than ever, and increasingly reliant on exploited contingent faculty and rising tuition fees--cannot be what it was even 50 years ago, and what it still is in so many people's minds.

This is not a matter of ideological positioning, but one of recognition: universities have changed, for good or ill. But while we face certain contextual realities, our actions in the present and our choices for the future will reflect principles and values, and it's those choices to which we now need to look, and to those principles we'll have to rally.

Our systems can no longer afford to bear those who in the past sought tenure for its security and financial rewards--nor those who seek to contain their knowledge within the mythical Ivory Tower. In my opinion we need to resist the purely bottom-line oriented, economic model of governance that frequently predominates, the one that treats knowledge as an object and education as a commodity; but resistance will be a matter of principle as well. And in order to have other, better options we'll need to be ready to participate and collaborate, to help think of new solutions for sustaining this oldest of institutions, to contribute to its re-invigoration with all that our fertile brains have to offer.

The inculcative ethos of the academic PhD sets up the question--should we "abandon" the academy, or is it more ethical to tough it out and fight for the old ways? I think the answer to these questions is both yes and no. Tenure as we know it is not the solution to the need for more teachers at universities. But neither is the exploitation of thousands of young (potential) scholars who have the desire to build fully-rounded academic careers. On the other hand, the features of tenure--academic freedom and job security, fostering long-term commitment to the institution and to students--still have a definite purpose and should be incorporated into/cultivated by whatever model we create. Academic freedom is now more important than ever and still under threat, as some recent cases in the United States show.

A related point: just as the academic career shouldn't be a sacrifice, teaching shouldn't have to be a labour of love. We need to come up with a way to change the distribution of work in universities such that those who are happy to teach and good at it are offered long-term stability and rewards, just as tenured, research-oriented faculty are now. And we should strive to allow for more movement between academic work and other kinds of engagement and research, with recognition of that "other" activity in the promotions process. These kinds of changes would help to overcome problems of inequity and the lack of faculty diversity, as well as opening up more options for students, allowing them to develop the social capital necessary to move to positions outside the university. This could also help to dispel the misconceptions and negative stereotypes that abound in public discourse about university education, and about professors specifically.

And of course, all this will entail a different understanding and practice of graduate education, one that can encompass preparation for academic careers but also for other applications of graduate-level skills and expertise.

I've been lucky to have a lot of good guidance on my own journey. I have role models who work or have worked both within academe and outside it (often simultaneously), so I have something to look to when it comes to "imagining" a different kind of career or even a different "way of being" as a professor. These people have helped me to acquire the explicit and tacit knowledge I needed to understand and participate in academic life, and they've provided invaluable support and encouragement.

But they've also taught me to consider other possibilities, to think reasonably about my goals and how best to achieve them. Now I'm asking not only "is there a tenure-track job for me?" but also "would I do a really good job as a professor? Would I be happy?". For me this is important, partly because I want a mantra of feet-on-the-ground guidance in my attempt to navigate the murky bog of dissertation-writing, "professional development", fellowship applications and the post-grad-school job search. I'm hoping the combination of keeping informed, building social capital and cultivating self-awareness will be enough to keep me afloat through all this chaos. I've learned to plan and prepare, and to make decisions in stages.

Perhaps, after all, these are the skills we should cultivate in our graduate programs: self-knowledge, adaptability, independence, creativity, and the ability to question our own assumptions, as well as the resilience to deal with the outcomes of that questioning.

Sunday, September 5, 2010

Decisions, decisions, Part 1: What's in store?

Almost every day I take time to read the higher education (PSE) news from Canada and around the world. And every day a cluster of common (and inter-related) themes tends to dominate the articles and blogs.

One of those themes is: How many (or how few) tenure-track jobs are there available for new PhDs in various fields? Can we give tenure to "adjunct" (contract) faculty whose working conditions are insecure? Given the lack of tenure-track hiring, should we be encouraging and preparing grad students for careers outside academe? And inevitably the questions arise--should we retain the tenure system in universities? Can we keep it, and if so, how and why? What purpose does it serve, and for whom?

I'm going to try not to repeat too much what others have already said, since the discussion has been a regular one over some time and many of you have been following it with interest. What I write here is profoundly influenced not only by what I "study" (post-secondary education) but also by who I am, since the question of tenured academic employment is more than merely theoretical for me--it's about actual life choices I need to make in the immediate future. My personal perspective is that of a PhD student who will need to decide, within the next couple of years, about either focussing on an academic track or looking for work outside the PSE system (and possibly returning to it later in my "career"--if I'm lucky).

I feel deeply conflicted about this issue. On the one hand, I love the "ideal" of the academic life: I love teaching and would like to be able to do research of my own (and even write the book I have planned). I was drawn into grad school because I loved the conversation, the learning, the sharing and development of knowledge and ideas that occurs when academe is at its best. And I like participating in the continuance of the university itself, in decision-making within the institution.

But then again, close observation of the academic environment over the course of about 7 years has led me to doubt the reality of the "life of the mind", to question its continued existence in its (past and) current form, and to think through the privilege that is necessary merely to have access to such a life, let alone to live it through the university. I feel more trepidation and doubt now than I did at the end of my BA. What kind of career might be possible for someone like me in the increasingly competitive environment of the university--and would I want it?

I do love teaching, but I frequently feel frustrated by the context of teaching, wherein I've often felt stressed and compromised and have seen many others in the same state. Universities have continued to expand during the last 30 years in spite of relative declines in funding; the growth in undergraduate numbers has meant an increase in the amount of teaching work, and this work has been transferred to inexpensive contract faculty rather than to new tenure-track hires. Universities are now dependent on such faculty, and on inexperienced graduate students, to carry out undergraduate teaching at budget rates--in spite of the potential for negative effects on the learning environment.

Even as the need for teachers has increased, research and publishing are still the main means of reaching desirable tenure-track jobs. For those unable to score such a position immediately after the PhD or post-doc fellowship, the "hamster wheel" of contract teaching can take up all the time that might have been put towards writing. Gender also matters: not only is teaching itself feminised, but as a female entering my 30s I will face difficult choices about family and career--choices that often put women at a disadvantage in the university workplace, where we already earn less on average than male scholars. Contingent faculty also have much less input--if any at all--into the way the university is run, so they are shut out of decision-making processes that affect them.

The question of "tenure or no tenure, academic work or not" is not only about choice of jobs. Academic training involves 10 or more years of post-secondary education, which can mean stalling the supposed milestones of adult life (buying a house and/or car, having children, building a long-term retirement plan and so on) until your late 20s or early 30s--unless you had a healthy amount of economic privilege to begin with. This is a significant investment of time, money, and other resources. If you've managed to accumulate a mound of student debt during your time in university, then you'll also be trying to find ways to juggle that with your regular living costs. In other words, you'll want a steady, reasonable income, not the tenuousness of contract-to-contract teaching work.

The lack-of-tenured-employment problem is not just a short-term one, a "dip in the market". On the contrary, it is bound up with the structural changes associated with massification that have occurred in universities over the course of the last 60 years or so. For a while (from the 1960s into the 1970s), the potential problems were allayed simply by injecting more public funding into the system and hiring more full-time professors, as a means of increasing accessibility for previously excluded groups. But the recessions of the 70s, followed by 1980s neo-conservatism and (here in Ontario) the Harris Conservatives in the 90s, have made fiscal instability the norm. Hence contract faculty also serve as conveniently expendable labour when budgets shrink.

The future of tenure as a system is shaky, primarily because of these structural issues. As our PSE systems are stretched to their limits, old ways of doing things have come under attack not only by those marginalised by the existing, unequal tenure system but also by increasingly influential "stakeholders" outside the university. Tenure was a system that functioned reasonably well when universities were elite institutions with few undergraduates and even fewer graduate students, but in Canada at least, the beginning of the end of that arrangement came in the 1960s. And it's somewhat ironic that while universities have become more "accessible", tenure is now becoming much less so.

Even as contract faculty form associations to lobby for their rights, we see regular stories from the United States and elsewhere about PSE institutions making it easier for themselves to dismiss tenured faculty as well. So changes to tenure are already becoming an issue that affects everyone, one that needs to be resolved fairly, sustainably, and soon. If we don't come up with a more equitable solution by design, then the situation is likely to degenerate along the current well-beaten track--with persistent inequalities between a small, elite group of well-paid research professors (and increasingly, administrators), and the non-permanent faculty who pick up the expanding teaching duties necessitated by mass post-secondary education.

None of this looks to me like the kind of situation on which I want to stake my own career and livelihood. And I think the "rational" decision would be to choose some other field. But my love of learning--and of helping others learn--is not necessarily rational, though I do have a healthy desire to see things change for the better and to put my own energy toward that goal. As always I'm walking a line between intuition and "reason", frustration and elation, helplessness and empowerment, and looking for some happy middle ground on which to build a launching pad, a castle, a jungle gym, whatever seems necessary. Of course that must be done whilst successfully navigating the way through the PhD process, but I'll get to that in my next blog post.

--------------------------

Coming up soon, in Part 2: Why do so many of us want to be professors? The culture of graduate school, changing needs of grad students, the uses of tenure and a few ideas about (positive) future prospects.

Wednesday, September 1, 2010

Moving

I spent most of last week packing up my entire apartment, then shifting it all back to Hamilton (with the help of long-suffering friends). I'm hoping to get back to blogging shortly, now that I have home internet running and the chaos is beginning to dissipate...! Anon.

Sunday, August 15, 2010

The basics: About this blog, and me.

Since this is a new blog, I thought I would start out by describing it and myself. I think setting a kind of 'tone' in the first post might help me to frame the rest of what I write here.

A bit about me, to set the context: I'm a graduate student, an international transplant (from New Zealand to Canada), and I've lived in four Canadian cities, in three different provinces. My degrees are in three 'disciplinary' areas (Communication Studies, Linguistics, Education), and my education history is a long and somewhat messy one that I won't recount here. I've worked variously as a fast-food jockey, an ESOL teacher, a scrap-yard cataloguer, a dishwasher, a researcher, a graphic design assistant, and a census-taker. I've also participated both as volunteer and paid worker in quite a few elections (aided in campaigns, worked on election information distribution, registered voters, acted as scrutineer, assisted Returning Officer, etc.)--since well before I was allowed to vote.

These days I'm working on a PhD and I have a teaching assistant position each year. Though I haven't yet taught a whole course myself, I've taught more than many PhD students at my level, since I started my first TA position when I was still an undergraduate. My grad school teaching experiences have also been a bit more diverse than the norm--I've been fortunate enough to work with different courses every year (sometimes in different departments), including in the teacher training program run by my "home" faculty. Working with teacher candidates is rewarding because of the very real and direct challenges they face in their own classrooms during practicum, which foreshadow what is to come later in professional life. The discussions have a kind of relevance and immediacy to them that can seem absent with first-year undergraduates. Still, it's a different experience working with the younger students, and it's rewarding in other ways.

There are probably a number of things about the way I try to work--as a writer/researcher, as a teacher--that will bleed through to this blog, just as the questioning I bring to my research also comes into the classroom when I'm talking with students. Teaching and reading have taught me that I have broad interests and a pretty tangential way of thinking, so I've learned to keep relating things back to a theme or to some common question. And one way in which I like to do that is to return to the discussion and definition of key terms and basic concepts. In tutorials I've tried to emphasise asking fundamental questions, the answers to which often seem “obvious” but which tend to demonstrate how the "easiest" question can turn out to be the toughest one to answer.

This kind of questioning is more than just an exercise devised to provide fodder for course grades. My feeling is that if underlying--often apparently "only" philosophical--issues are not debated and fleshed out, then the overall direction of teaching and learning, and also of our theorising and policy-making, will be uncertain and/or skewed. There will be a lack of solidity to the proposals, and no cohesion around the principles. We want a "knowledge economy", but we don't know (or bother to define) what "knowledge" is. We talk about increasing a nation's economic worth by raising the number of post-secondary graduates it "produces"--but we don't question the reduction of civic participation to numbers of degrees earned. We want children to "learn", but only if this learning shows up in the results of standardised tests. We demand “evidence” of progress, efficiency, and effectiveness, without wondering about what it is that we allow to “count” as proof, and who defines it.

When these fundamental concepts are left un-discussed and undefined—as well as un-critiqued—outside of scholarly journals, the scaffold of common understanding on which politics and policy should be built becomes biased and superficial, and is weakened at its base.

Being critical is about more than just talking about what’s wrong; eventually we must be able to propose solutions as well. This will mean learning how to work with people whose views we may not share. For this reason I think we also need to cultivate an environment of mutual respect for discussion.

That's a difficult task. I know that naturally, there will always be some perspectives with which I agree more than others (and some that to me are just egregious). For example I tend to be anti-marketisation, because looking at the effects of that particular trend in governance is part of my academic work--and I haven't seen much evidence of its 'success' (depending of course on how you define that term). But that doesn't mean I won't try to have a reasonable debate with someone who is strongly pro-market; without that kind of debate, we can't solve policy problems and we certainly can't delve deeper into the core issues that drive governance decisions. And it doesn't mean that I'll argue blindly for some other viewpoint, since I'm still unsure about what the "best" answers are to the challenges of our current context. I have no problem admitting that. But I try to resist the binary options that are so often placed at the heart of political debates and the programs they support or attack.

That's partly why I'm doing a PhD--not so I can bolster my pre-formed opinions with a credential, but so that I can try to understand the situation (in its often stultifying complexity) and make a contribution to improving it. Perhaps it sounds ambitious, but surely it's a project in which we all have a role to play, as educators, students, intelligent and informed commentators, and engaged members of society. Surely we all have a stake in whatever 'solutions' are chosen, which makes the quality of debate all the more important. If we can't talk across--and beyond--our differences, then small, ideologically-driven factions will be more likely to gain influence over government and policy-making, something Canadians are already starting to recognise.

Learning how to understand others’ points of view, working to negotiate reasonable compromises across differences, thinking critically about language and information, allowing multiple forms of evidence and experience to inform our conceptualisations—these are all broad skills that will be necessary for us to cultivate if we are to resolve the great social, political and economic dilemmas that will confront us in the coming decades. Education will play a role, but that role will depend on whose idea of “education” is prioritised and mobilised through policy and governance.

I suppose talking about the context of what I write here has ended up leading to a rather lengthy ramble on the importance of education for the future of our species—a predictable message after all! I can only hope I’ve at least couched it in interesting terms.