On the Need for a New Rhetoric: Part I — Setting the Stage

In 1958, Stephen Toulmin published The Uses of Argument, which espouses a method of argumentation that is one of the current foundations for teaching rhetoric and composition at many universities. In brief, it is a system for building and refuting arguments that takes into account the claim, the evidence required to support it, and the assumptions that claim is based upon. Its versatility and effectiveness make it a natural choice for teaching first-year composition students how to make the jump from high school to university-level writing.

As good as this system is, it does not address the largest change in writing in English since 1476 — the year William Caxton introduced England to his printing press and publishing house. Although it is almost impossible for us to conceive in 2017, his editions of Geoffrey Chaucer’s Canterbury Tales and Sir Thomas Malory’s Le Morte d’Arthur were fast-paced, next-generation texts that relied on an advanced and inherently democratizing technology. No longer was one limited to the speed of a scribe. Leaves could be printed off in multiple copies rather than laboriously copied by hand.

Over the following five or so centuries, this change in the means of production has had an undeniable impact on the act of composition. Dialect[1], punctuation[2], and spelling[3] standardized. Writers began to write for the eye as well as — and then instead of — the ear as reading moved from a shared experience to a private one.

The world that the printed text created was the world within which Toulmin formulated his method of argumentation. While it used the work of Aristotle and Cicero, it no longer situated itself within an oral and aural world. A world where a reader can stop and reread an argument comes with a different set of requirements than one where an orator must employ repetition to make certain that an audience grasps the point. By way of example, the change in tone associated with Mark Antony’s pronouncement that “Brutus is an honorable man” is more powerful when heard from the stage (or screen) than when seen on the page (or screen) of Act 3, Scene 2 of Julius Caesar.[4]

There have been several innovations and improvements along the way. But the addition of pictures for readers and typewriters for writers did not inherently change things. Yes, the material produced by a writer would look like its final form sooner in the process, but the process did not change. The limits of experimentation were bound by the limits of the codex[5] format.

With the development of the computer and modern word-processing, however, we are experiencing a change every bit as significant as the one wrought by Caxton — one that will once more change the way we compose. Indeed, it already has changed the way we compose even though we do not all recognize it. What is currently lagging is a new methodology for composition. 

Before I tell you what to expect in the next post, let me tell you what not to expect. I will not be discussing the shortening of attention spans or the evils that screen time has wrought upon our eyes and minds. In truth, many of those arguments have been made before. Novels, in particular, were seen as social ills that harbored potential for weakening understanding. In fact, the tradition of railing against new methods dates back to the invention of writing. When Thoth came to the Egyptian gods to tell them of his revolutionary idea — an idea that would free humans to transmit their thoughts from one to another in space and in time — the other gods objected, noting that writing would come to destroy human memory, much in the way people now mourn their inability to remember phone numbers because our smartphones remember them for us.

What I will be positing is a materialist argument: that writing on a computer or mobile device screen has significantly changed the model we use for creating arguments and composing the form they take, because writers have moved from an architectural model of production to an agricultural one.


  1. For those non-English majors reading, the Middle English spoken by Chaucer, a Londoner, was noticeably different from the Middle English of the English Midlands spoken by the Pearl Poet, the unnamed author of Pearl and Sir Gawain and the Green Knight.
  2. The manuscript of Beowulf, for example, is one long string of unpunctuated text. It was assumed a highly educated reader would know where one word ended and the next began.
  3. William Shakespeare, famously, signed his name with different spellings at different times. This was not a sign of a poor education. It is a sign of a period before any dictionary recorded standard English spelling.
  4. It is worth noting that this example carries within it another example of the change being discussed. William Shakespeare did not write for a reading audience. He wrote to be heard, and changes in pronunciation have obscured jokes and doubled meanings.
  5. Given what will come later, it is worth beginning to use the technical term for the physical form that books take — leaves of paper within two protective covers.

Dr. Matthew M. DeForrest is a Professor of English and the Mott University Professor at Johnson C. Smith University. The observations and opinions he expresses here are his own. You are very welcome to follow him on Twitter and can find his academic profile at Academia.edu.

Instagrammar

For a good while now, I have been trying — and failing — to get the hang of Instagram. I hasten to add that my inability to wrap my head around Instagram applies equally to Snapchat and any other photo-based social network. The issue isn’t the interface or the filters or controls or anything else about the usability of the app.

I don’t speak the language of Instagram but I am learning to understand it.

I consider my inability to speak this language a problem. 

In one of the unrecorded discussions of our most recent Mellon funded Summer Institute, Fraser Speirs pointed out that if there was a theme to one of our day’s (or days’) discussions, it had to have been Instagram and how our students had begun to gravitate towards it as a communication platform and social space — one that we should understand and learn how to use to reach them and, when appropriate, to bring into our classrooms. 

I had looked at Instagram (and Snapchat) in the past and knew several people (including the always au fait Jemayne King — check out his “Meme You No Harm” talk) who had made the jump to photo-based platforms. So I dusted off my account[1] and tried once more to navigate the platform. 

I still don’t get it. That said, I have learned a lot from those who do. In my attempt to “get it”, I selected people to follow from among former students, some current colleagues, and some artists (mostly dancers, photographers — both professionals and serious hobbyists[2] — and sculptors) of my acquaintance. I also added in some organizations I respected highly, including NASA.

I cannot say that there is a one-to-one correspondence between artists and high-quality Instagram accounts but the odds will ever be in your favor. Their eye has been trained to follow the phenomenal world and either capture compelling images or know when one has been captured.

What I found particularly interesting was what NASA was doing with Instagram Stories. I had initially followed them for their collection of beautiful space imagery. They let me go, at least imaginatively, where no one has gone before. For those interested in the application of technology in the classroom, NASA’s stories are worth considering carefully.

For those unfamiliar with stories, they are a series of still images and videos that a user links together. In theory, they tell some sort of story — even if it is as simple as “Look at how my day went.” Those teaching narrative in Creative Writing and Photography classes should take note of this. It is a way of building narratives within a means of distribution.

NASA is using these stories to create self-contained mini-presentations on their missions or on a space science topic. They are well put together but, as is appropriate to the medium, they are rough cuts rather than highly polished artifacts involving advanced post-production.

They are a model for the rest of us. After all, these are people explaining rocket science to those that aren’t rocket scientists. It is the kind of thing that any academic could use to illustrate self-contained, foundational items from their field — the elements that we so often despair that our students don’t come to us having learned. An English Professor, for example, might put together a series of stories that explain the parts of poetry. One story could focus on the iamb followed by another on the trochee. Or, perhaps, a series that explains the comma and how it is used.

The stories may be ephemeral, but there is no reason that they could not be re-released as reruns by a sufficiently enterprising individual.

Now, if someone could put together one covering the language of Instagram, I would appreciate it.

 


  1. Those of you who are curious should read the privacy policies and understand who owns what before jumping into these sites. Those who have developed an antipathy towards Facebook, for example, should understand that Facebook owns Instagram. For those who have bought into the Facebook platform, however, this will come as great news because the two services integrate easily.
         I don’t mention this to be an alarmist but because users should remember that free services are paid for somehow and that others might try to unscrupulously profit from their work and/or family photos. Look at your privacy settings and make sure they are set at a level that you are comfortable with. Also, be sure to understand that the deal you have made about the pictures you upload could change as companies are bought and sold.
     
  2. I chose Andy Ihnatko and Scott Bourne’s accounts to use as examples here because not only are their accounts public, they regularly refer people to them on podcasts. There are many others whose names (e.g., https://www.instagram.com/jbphotography2016/) could easily appear here.

Dr. Matthew M. DeForrest is a Professor of English and the Mott University Professor at Johnson C. Smith University. The observations and opinions he expresses here are his own. You are very welcome to follow him on Twitter and can find his academic profile at Academia.edu.

An Always Engaged Audience

Some background: For those in other parts of the world who may not know, this fall has been unusually rough when it comes to the bugs that have been floating around here in North Carolina. Like most parents, my wife and I have a working arrangement — subject to change based on the needs of the day — as to who will stay home with our daughter when she has to stay home from school. Since we are both professors, this pattern generally aligns with the Monday-Wednesday-Friday (MWF)/Tuesday-Thursday (TR) split. 

This semester, I have the MWF shift. This is counterintuitive, as my classes this semester are on MWF. She, however, has classes with more in-class assignments on MWF than I do. As a result, my MWF schedule is more open to alternate approaches.

I have been using Periscope to stream classes when I stay home. It is an imperfect vehicle for what I am attempting[1], but it gets the job done. 

In a strange way, I have found it surprisingly comfortable to hold class via Periscope. And after class today, I think I have settled on why this is.

For those of you who have never taught, facing a room full of students can be a depressing task. I know that my students are more engaged than they look. Their questions and comments have, on more than one occasion, proved as much just moments before I was about to succumb to despair. But if you know the semi-blank look that people assume when they watch television, you know what you will see looking out at a room full of students. Not all of them look like this, of course. Some are more animated and some are less. Nevertheless, there is a passive look that pervades the room. This can be true with the most engaged of students. If a student is taking notes, for example, you do not get to see the animation in their face because they are looking down.[2]

When you broadcast on Periscope, you look at yourself. It is a feature that lets the broadcaster know what his audience is seeing. So, when I am talking about Mark Twain and H. G. Wells,[3] I am looking at someone who is actively engaged — not a classroom of students who are paying attention and trying to process what is being presented or discussed.[4] 

I know that when I present, I feed off of those who are actively engaged. Most people in front of an audience do. When that is happening, I feel like I am doing a better job (Whether I am or not is a different question.). With Periscope, I provide myself with a positive feedback loop.

As a result, classroom performance, in the literal sense of the nature of what is presented rather than its content, could (Let me stress: could.) improve on the in-room experience with access to this technology, if it can be successfully linked to a mechanism for student participation, as discussed in footnote one below. It might also be worth considering and weighing for those running experiments with classroom delivery, as can be seen at Minerva University or through on-demand services like Khan Academy.


[1] If you want to see what can be done with a streamed class that functions very much as an interactive seminar, I would highly recommend that you tune in to one of Signum University’s open classes. You will find me sitting in on “Exploring Lord of the Rings” most Tuesday evenings, beginning at roughly 9:30 PM Eastern. Professor Corey Olsen simultaneously broadcasts via Periscope/Twitter, Twitch, and Discord while being “present” in Lord of the Rings Online. The online version of Middle Earth allows for the classes to take field trips to locations of note every week. (The broadcasts are then made available on YouTube, as can be seen in this randomly chosen example.) He manages to juggle three chat areas (Discord is the primary location for the comments and questions.), where people ask questions and offer comments.

Since this blog is about the iPad in the educational space, I will let you know how I attend. I run Twitch (which contains the video and audio stream I use) and Discord (where I am present in the chat) in split-screen mode on my 10.5” iPad Pro. I find it quicker to type my comments on the Smart Keyboard but often use the onscreen keyboard. To get a fuller picture and sound, I AirPlay the Twitch stream to my Apple TV. That, however, is a creature comfort and allows me to avoid resizing the split-screen view to see more of the slides presented on the screen and/or type. It is possible to have a decent experience doing it all on the iPad.

[2] In case you are a student and are wondering, your professors can tell when you are writing about their class rather than another. The rhythm of your engagement in the class and the engagement with the page are either aligned or are not.

[3] Today’s 9AM class, which is on the way people understand time and how that is expressed in art and society, wrapped up A Connecticut Yankee in King Arthur’s Court and started The Time Machine.

[4] For those who might want to offer a well-meaning critique here and talk about active learning, let me offer a quick distinction: The look on a student’s face is a function of what they are doing rather than the pedagogical structure they are existing within. The same look will pervade the faces of students during the kind of discussion or activity you might suggest I try. There is a material difference in the look worn by a student when they are “on” — when they have the floor or are talking — and when they are not, whether they are in a lecture or working in a small group. As a practical matter, it is impossible for everyone to be fully active at once. It is a question of how often they are fully active, how often they are partially active, how often they are passively active, and how often they are disengaged.


Dr. Matthew M. DeForrest is a Professor of English and the Mott University Professor at Johnson C. Smith University. The observations and opinions he expresses here are his own. You are very welcome to follow him on Twitter and can find his academic profile at Academia.edu.

140 Characters of Outrage or 140 Character Koans

Andy Ihnatko’s recent comment about the release of the iPhone X is a great example of the potential held within a Tweet.

[Screenshot of Andy Ihnatko’s tweet]

There is little to disagree with here. Everyone should have their soul’s dignity respected and held sacrosanct, whether it is by others or by ourselves. But his statement has an edge to it — one that cut many, myself included. Based on the 126 (and counting) responses and replies to the Tweet, I was not alone.

Yes, I felt lucky that I got an early date on my upgrade to the iPhone X (Being sick, I slept through the 3 AM Eastern launch time.). But I have had to wait for technology upgrades before and did so with a shrug. I considered responding to the tweet in order to make the distinction between the sense of feeling lucky to get an early date and the kind of toxic need to be a first-day adopter. 

Then, I saw the outrage from those who took great umbrage at a statement that is safely true.

I won’t get into the details of the responses (You can read them for yourself, if you wish.) as I don’t feel comfortable speaking for others in regard to what they felt and why they felt it. Their outrage, however, triggered a need in me to confront my own response. While I don’t think my relationship with technology is toxic (I can stop any time. Really.), my need to justify myself to someone who does not know me from Adam certainly signals something — something more than acknowledging the false intimacy that social media can sometimes breed. And while my wife’s occasional jokes about my relationship to my iDevices being a little too close to Gollum’s to the One Ring might be overstating it a bit, the cutting edge is still there.

Ultimately, I do think that the technological revolution offered by these devices is more democratizing and liberating than it is binding. I am able to stay in closer touch with people I know on multiple continents than I would have been at any time in the past. Yes, it is true that, in centuries past, there were great letter writers who had such correspondences — ones that were much more intimate than seeing a picture float past on a social media platform — but those people tended to have access to servants to handle things like getting dinner ready. Now, such connections are available to anyone who can afford the technology and the fees required to connect to the internet[1].

No, I am more concerned about our collective reactions to the koanic nature of Twitter and, to a lesser degree, other social media outlets. Koans are difficult and knotty things designed to make one reflect deeply on the nature of self and one’s relationships to the universe. While I don’t think most tweets rise to the level of koans, they do touch a similar nerve in us. It is our reaction to them that is telling.

Ihnatko’s statement is not a judgement of any individual, and he has repeatedly said that there is nothing wrong with wanting an iPhone X and taking pleasure in getting one. That did not stop me (and others) from trying to justify our feelings of good fortune at getting it sooner rather than later, nor did it stop others from insulting and demeaning him.

I would submit that this tells us a great deal more about those of us responding (Let’s face it: The fact that I am writing this means that I am responding at much more length than those on Twitter.) than it does about Ihnatko or the iPhone X. 

This soul-searching is probably more than a little necessary. The technological revolution we are experiencing will likely be looked back on 500 years from now as a Renaissance that dragged us out of a kind of Dark Age that began sometime around 1914[2], if not before. It is changing the world as significantly as the introduction of movable type to the West in the mid-1400s and it is doing so more quickly. We need, as individuals and as a society, to puzzle out how we will relate to these devices, given how intimate these relationships have become. 


[1] These costs may not be trivial, but they are less expensive than maintaining an aristocratic lifestyle.

[2] Another potential date would be 1789. In cases like this, however, it is best to leave it to the future. They will have a much better perspective on things.


Dr. Matthew M. DeForrest is a Professor of English and the Mott University Professor at Johnson C. Smith University. The observations and opinions he expresses here are his own. You are very welcome to follow him on Twitter and can find his academic profile at Academia.edu.

A Tip: Presenting with an iPad

About a year ago at a conference, someone looked at my iPad just before I got up to present a paper.[1] That what I was doing came as a surprise to them made me think I should pass this tip along to readers here.

This is a screen shot of the paper I recently submitted for the conference proceedings[2] of the Yeats Society of Korea's 2017 International Conference on W. B. Yeats and Movements in Literature, Art and Society in Seoul:

[Screenshot: the paper as submitted for the conference proceedings]

And here is what appeared on my screen while I presented.

[Screenshot: the same paper, with the font size enlarged, as it appeared on my screen while presenting]

It doesn't take a rocket scientist to increase the font size on an electronic device. What it does require is for us not to lose sight of what we can do with a digital-first document. It is easy, after all, to get attached to the thought that the time of the presentation is tightly tied to its length.[3] But once the paper is completed (Well, as completed as any presentation draft gets....), we are free to change its appearance to suit our immediate needs.

The sharp-eyed among you will notice that these are two separate files. I use a duplicate of the completed draft because it invariably needs editing to account for the fact that you can do things in writing that you cannot in speaking -- and vice versa. Long sentences, for example, can be complex on the page or screen (see footnote one below) without risking the loss of a reader in a way that they cannot be when a speaker is addressing a listener.

Using a synced electronic copy also means I have a backup. If something goes wrong with my iPad, I can pull out my phone and access the file. It may not be as convenient, but it sure beats having to try to recite your paper from memory.

Incidentally, this approach also works with paper printouts and on laptop screens. These methods have some drawbacks, of course. Printing in a larger font means more pages and an increased chance of the pages getting shuffled (I always make sure to have the page number formatted as "X of Y pages".), and laptops are more awkward to use while standing at a podium, where it is not as easy to scroll through a document while presenting as it is on an iPad. These are, however, things that can easily be worked around if you haven't jumped on the iPad bandwagon.

---------------

[1] For those outside of the humanities who may be more used to other methods of presenting (e.g., poster sessions), I generally give a 20 minute presentation when I attend a conference as part of a 90 minute long panel. Any time that remains after the three presentations (additional time is eaten up by introductions, people getting up and sitting down as one speaker makes way for the next, people running over their allotted time, and the like) is a Q & A and discussion period. Immediately following this, everyone rushes for the bathrooms and/or the coffee station.

[Photo: the printed conference proceedings, with a pen added for scale]

[2] One of the nice things about the Yeats Society of Korea's conferences is that the proceedings (seen below, with a pen added for scale) come out before the conference and are distributed to the attendees for their use during the event. As such, we all have the papers in front of us and can make notes in them as we listen. This past year, we also received it electronically and I was able to use Goodnotes to annotate the document.

[3] For those who have not done this regularly, a twenty-minute paper is roughly eight double-spaced, 11-12 point pages long.


Dr. Matthew M. DeForrest is a Professor of English and the Mott University Professor at Johnson C. Smith University. The observations and opinions he expresses here are his own. You are very welcome to follow him on Twitter and can find his academic profile at Academia.edu.

The Tyrannies We Falsely Blame on Technology

As this occasional blog may indicate, I am far more technophile than technophobe. Indeed, I prefer to embrace titles like “technophile” and “early-adopter”(^1) to stave off alternate images that would align my behavior with addiction — a case of Star Trek’s holodeck addiction writ small — or with the wish fulfillment and delusions of grandeur of imagining myself a superhero like Batman, with his voice-controlled Batmobile and other wonderful toys, or Tony Stark, with his computer assistant JARVIS.

I can stop any time. Really.

My predilection for technology has led me to think about technology and its use in the classroom on more than one occasion. Indeed, a search of iTunes will yield four Summer Institutes(^2), generously funded by the Andrew W. Mellon Foundation, that I organized and that focused on Technology and New Media within the academy. Most of my personal focus has been on how the small, incidental uses of technology can improve the life of a faculty member and the experience of students in the classroom rather than on large-scale initiatives — how a service like Periscope, for example, can come to your aid when you have to stay home with a sick child, as opposed to an analysis of how to roll out online learning campus-wide for all faculty and students. I know people who do the latter (and try to work closely with them) and respect their work. It’s just that this is not where my active interest currently lies.

As with all things, both levels of interest in ed-tech run the risk of losing sight of first causes — the underlying assumptions and needs that drive our decisions. Recently, it took me a surprising amount of effort to trace back to first causes my discomfort with a story that, when I read it, I thought I should be excited by. Union County Public Schools in North Carolina (I am pleased to say my daughter attends a school within this system.) published a piece well worth reading on how Vinson Covington, the AP European History Teacher at Parkwood High School, was getting his students to create a mobile app as a vehicle for learning about history.

Before I go any further, I want to make one thing clear. I think this is a fantastic, inventive idea and that Covington should be applauded for his work and for creating an environment where his students are engaged and are challenged to think about the subject differently. Nothing that follows should be seen as taking away from this, my personal and professional (for what little that is worth) assessment of what he is doing, or from my hope to see more teachers doing cross-disciplinary and interdisciplinary work like this at all levels of education. It is absolutely critical for all of our futures.

But as I was writing, I read this article and knew I should be interested and excited by it. Instead, I found myself disquieted. My first response to this disquiet, which I shared on Twitter, was that I would have felt better if it was part of a team-taught course, where the coding and the history could both be more fully explored by the students. And while I still think that, I no longer believe that is the source of my disquiet. Team-taught courses are great but, from a staffing point of view, only occasionally practical. The kind of thing that Covington, on his own initiative, is doing here is a solution to the real zero-sum game that administrations play when trying to best deploy the limited manpower available.

Ultimately, I believe the source of my disquiet is the underlying assumptions about which disciplines should make space for others and how that space should be created. Those assumptions are building a hierarchy that many insist does not exist — even as they participate in building and reinforcing the hierarchy. 

In Covington’s case, there is no sense — even in my mind — that it is wrong for history faculty to introduce a coding project into their classroom. Indeed, I remain in awe of what Mike Drout and Mark LeBlanc accomplished and continue to accomplish with their Lexomics Project and know that I need to find the time to use their tool to satisfy some of my own idle curiosities.

To illustrate my concern, consider how non-English and Language Arts faculty react when they decide that their students cannot write well. They turn to the English faculty and ask why they have not taught the students better and look to them to provide solutions. There is no perceived cultural pressure on the non-English faculties to introduce writing into their areas in the way there is to introduce coding into history, as in the case of Covington’s work.

And I hasten to point out that the kind of cultural pressure I am pointing out is not just negative pressure. Covington has been singled out for praise here for his innovation. Can you conceive of an article that praises a member of a Biology faculty for having Pre-Med students write sonnets to improve their writing and interpersonal skills? Can you see that article treating such assignments as anything other than silly? Or as a waste of time that would be better spent on the difficult subject matter material that students are perceived as needing to cover to succeed in medical school?

And yet, no one will claim that understanding how World War I started is easy or unimportant. After all, understanding that a highly mechanized, semi-automated system of distributed but telegraphically linked command posts with a series of standing orders that, once begun, cannot be stopped without destroying the system (i.e., the Schlieffen Plan) is analogous to our contemporary computer-controlled military systems might be what prevents World War III. And learning about sonnets and people’s emotional reactions to them might help a doctor have a better bedside manner or a sufficiently greater sympathy with a patient that lets them notice that, despite the glitter of their walk, their patient may need help. It might help those employed by insurance companies see less of the paperwork of procedure and more of the people trapped within the system.

Innovation, then, must not be seen as having a one-to-one correspondence with technology, science, and engineering. Innovation is when we take new ideas and apply them in any field. The unfortunate truth about the way we currently recognize innovation in the Academy is that we have tied it too closely to the use of technology — so closely that we can no longer see when innovation is taking place in other areas. This matters not just for humanities faculty, who might fear they are becoming second-class citizens within their own disciplines. It also matters to faculty innovators like Brendan Kern, whose podcast on the life of an imagined exoplanet can teach students about biology through the exploration of an alien, new world. Such work is currently more likely to be advertised as “fun” or as a bit of fluff rather than as a serious attempt at pedagogical development and innovation that might make material accessible to students.

Whether we at all levels of the Academy choose to see innovation more broadly than the infusion of STEM and its wonderful toys into other disciplines will determine how likely we are to promote and recognize real innovation across all disciplines. It will require challenging many of our assumptions about how we do things and how much of a king our disciplinary content actually is. It will be difficult for many of us to do this. After all, it is easy to give the appearance of innovation if you see people working on robots or a flying car. It is less easy to do so when you watch someone telling or discussing a story. But both of these represent the skills our students will need to be successful and adaptable in the 21st Century. We must, then, learn how to refuse to be hidebound.

——

 1. For those who are curious, I added the hyphen because of some research conducted by Sparrow Alden, who noticed that, in The Hobbit, J. R. R. Tolkien appeared to hyphenate certain two-word phrases to indicate that they stood in for what would have been a single word in Westron (the common language of men and Hobbits in Middle Earth) and have come down to us as kennings (https://en.m.wikipedia.org/wiki/Kenning). Although early adopter is not traditionally hyphenated and is not as figurative as oar-steed or whale-road, it is nevertheless true that being called an early-adopter signals more than just being the first kid on the block with a new toy.

2. Or you could follow these links:

 The First JCSU Faculty Summer Institute for Technology & New Media

The Second JCSU Faculty Summer Institute for Technology & New Media

The Third JCSU Faculty Summer Institute for Technology & New Media

The Fourth JCSU Faculty Summer Institute for Technology and New Media and Problem Solving in the Interdisciplinary Humanities


Dr. Matthew M. DeForrest is a Professor of English and the Mott University Professor at Johnson C. Smith University. The observations and opinions he expresses here are his own. You are very welcome to follow him on Twitter and can find his academic profile at Academia.edu.

The Importance of Note Taking Apps

The team behind the GoodNotes app published a blog post about how Shirantha Beddag uses the app in her teaching.

While the blog post is worth reading for anyone who teaches music, it is also good reading for the rest of us. GoodNotes and similar apps, like Apple’s Notes and Notability, are foundation-level apps for those of us in education. As such, they run the risk of disappearing into the background.

But these note taking apps are the kind of thing administrators, faculty, staff, and students alike will use day in and day out. For that reason, it’s important to find an app that fits your needs. And, quite frankly, they are one of the biggest reasons to go with an iPad Pro and Apple Pencil combination. We may not all be ready to take advantage of the artistic potential offered by Procreate, but everyone who has meetings inflicted upon them needs a way to take notes.

For me, the reason that I ended up going with GoodNotes is its use of a notebook organizational metaphor rather than a system that hews closer to a computer file system, which is what is found in Notability and Apple’s Notes app. All three are strong, and there are others that are worth looking into. The strength of the App Store is that it provides options that users can weigh. For some students, the ability to record a lecture may be the killer feature, as opposed to GoodNotes’ ability to search handwritten text.

So how will you know, short of downloading them all and playing with them? Fortunately, someone has done that for you already. Serenity Caldwell has a great run-down of several apps on iMore. Also, ask around and see what your colleagues are using. I’ve noticed that people who have chosen an app, rather than just going with a default, are happy to show you how their app accommodates their idiosyncrasies.


Dr. Matthew M. DeForrest is a Professor of English and the Mott University Professor at Johnson C. Smith University. The observations and opinions he expresses here are his own. You are very welcome to follow him on Twitter and can find his academic profile at Academia.edu.

The Apple Product No One is Talking About (and Why Educators Should Care)

In the space between iOS 11 and macOS High Sierra, I wanted to offer a few words on the big product Apple announced that (with some exceptions) has gotten little to no air time.

I'm talking about the Apple Town Squares, formerly known as the Apple Stores.

It's worth listening to what Tim Cook and Angela Ahrendts say about the relationship of space and purpose, beginning four minutes and four seconds into the Apple Keynote of 12 September 2017. They spoke of the design philosophy behind Apple Park and the Apple Town Squares. Cook spoke about how Steve Jobs set out to "inspire talented people to do their best work" and how the retail spaces were designed to be "about learning, inspiring, and connecting with people" as much as they were about retail. Ahrendts spoke about the redesign of the flagship stores to serve as "gathering places for 500 million people" and described them as "Apple's largest product".

If you listen to the way they spoke about Apple's facilities, their descriptions could easily apply to some of the central aspects of a university. In case that is less than immediately apparent, here are some of the parallels between Apple's facilities and a university's -- if the clearly intentional parallel was not driven home enough when Ahrendts linked the Creative Pros to the Liberal Arts and the Genius to Technology (STEM, for those attached to the current nomenclature of the university):

The Plaza = The Quad

The Forum = The Classroom

The Board Room = Library Study Spaces

The Genius Grove = Faculty Office Hours

The Avenues = The Student Union, with its opportunities for students

Her talk highlighted Apple's efforts to offer Lifelong Learning opportunities -- for free -- to its community. Those who are in the business of supplying such opportunities to potential students at local universities should both take note, as Apple appears to be doing this in a more compelling and targeted manner than universities are, and be alarmed, as Apple is beginning to eat the lifelong learning lunch.

There is more, however. Apple Park and Apple's Town Squares are described in a language that, as I suggested, parallels the university. Given that universities are trying to figure out how to bring students to their campuses in an age when, quite frankly, there aren't enough students to go around and online learning and similar innovations might keep potential students away, Apple's successful strategies should be examined closely by those teams trying to figure out how to differentiate their campus in a way that makes it a compelling destination.


Dr. Matthew M. DeForrest is a Professor of English and the Mott University Professor at Johnson C. Smith University. The observations and opinions he expresses here are his own. You are very welcome to follow him on Twitter and can find his academic profile at Academia.edu.

An Observation about My Glasses

So, a couple months ago, I had to replace my glasses (The prior pair's arm had broken in a manner that could not be repaired.). Having reached a certain age, I needed progressive lenses (I had them in the last pair, which means I reached that certain age a while back.).

Now, as this (intermittent) blog indicates, I went all in on iOS a long time ago. That said, I have a Mac Mini at home to serve as a base of operations and a Mid-2011 iMac on my desk here at work. One of the things that has troubled me about this pair of glasses is that the progression is off. In my last pair of glasses, I would look at the iMac's screen and all would be clear. Now, it is fuzzy and I have to tip my head back.

Just moments ago, I finally figured out why. The progression is not set for people who work on a desktop. It is set up for people who work on a laptop or a mobile device, like an iPad or an iPhone.

A number of the podcasts I listen to are populated with people I respect, and I hear them bemoaning the death of the Mac in general and desktops in particular. Apparently, it is not just computer companies that have begun to adjust to a post-PC-of-their-memory/imagination world.

There is more potential angst to this than that supplied by aging tech pundits. This kind of change should be noted by the academy, which is a much more conservative and stodgy group than usually imagined. Perhaps we should begin to consider changing some of our ways if and when our glasses are letting us know that the world has moved on.


Dr. Matthew M. DeForrest is a Professor of English and the Mott University Professor at Johnson C. Smith University. The observations and opinions he expresses here are his own. You are very welcome to follow him on Twitter and can find his academic profile at Academia.edu.

What is So Different about the Three Current Categories?

There are currently three broad categories of devices that users can choose from, each of which can be subdivided:

Mobile: This category includes both handhelds (mobile phones and devices like the iPod Touch) and tablets. While these each solve a specific use case within the category, they all have mobility and a touch-first interface as their two primary features. They also provide their users with a task-based focus through the use of apps instead of applications.

 Strength: Portability of the device and the focus of the apps.

Needed Leverage:  An always (or almost always) available internet connection for extensive file storage.

Weakness:  Not as powerful, in terms of computing power and application flexibility.

Laptops: This category includes all devices that are portable but require a keyboard. Yes, they may have touchscreen capability, but the primary input is designed to be through a keyboard. They share with their desktop-bound counterparts an approach to tasks that uses applications, rather than apps. This represents not only a difference in focus but also in the number of tasks that a user can pretend they are doing at the same time. While the latest iPads can run two apps at once, laptops and desktops can have multiple windows open at the same time.

 Strength: Balances the power of a desktop with the portability of a mobile device.

Needed Leverage: Consistent access to power to recharge its battery.

Weakness:  Heavier than a Mobile Device.

Desktops: These devices are designed to stay in one place and provide significant computing power. Quite frankly, they provide far more computing power than most users need. More important for most users, these devices provide a significant amount of on-device storage for large libraries of files.

 Strength: Raw computing power and the capacity for a lot of local file storage.

Needed Leverage: An internet connection for off-site remote access when you are away from your desk and a device to access things while out of the office.

Weakness: Immobile and, for some, too many windows.

 

Seen in this light, the laptop, which was once the way to get work done on the go, is now a compromise device -- a role usually given to mobile devices (It should be noted that this role is usually assigned by people who use laptops.).

It is also worth noting that there is no longer a primary/base-line device for people to use. All three categories are viable computing choices for users. The question, then, is what the individual user needs at the time (given the constraints of their purchasing power). This is something that strikes home every time I come to Asia. You do not see people using laptops as a primary computing device. You see them using a larger mobile phone. It is a trend, incidentally, that educators should keep in mind, as students everywhere appear to be shifting toward this model and it represents a shift as significant as the move from punch cards to keyboards and from text-based to GUI-based operating systems.

Use case is a critical consideration in education, which has three primary user groups: students, faculty, and administration/staff. Of the three, the individuals most likely to be tied to a desktop are the administration and staff, who work at their desks (when they are not in a meeting). Professors are semi-mobile users, as they move from their offices to a classroom. Students, however, are clearly mobile users. They move from place to place throughout the day, as they go from class to class.

It is worth keeping this in mind when making decisions about the technology that students will be issued or be asked to purchase on their own. Mobility and portability, although students may not always recognize it, are important factors in their interaction with technology.


Dr. Matthew M. DeForrest is a Professor of English and the Mott University Professor at Johnson C. Smith University. The observations and opinions he expresses here are his own. You are very welcome to follow him on Twitter and can find his academic profile at Academia.edu.

My Biases

The most important thing you should know about any reviewer (whether it is an individual reviewer like Roger Ebert or Andy Ihnatko or a corporate reviewer like The Sweethome or Consumer Reports) is their preferences and biases. If you know these, you can gauge which part of their response to the thing under review you should listen to carefully and which part you should dismiss. I stress the you here because it is absolutely not the generic you. It is you personally. What I should ignore and focus on in a review is not the same thing as what someone else should ignore and focus on.

The iPad Pro

My first bias that you should know about is that I really, really like the iPad mini's form factor. I love how small it is, in terms of its portability and use. I don't find the screen too small to type on (because I have fully adjusted to it) and it packs easily into the slim line bag that usually serves as my briefcase.

As a result, the iPad Pro feels, even after several weeks of use, too big.

I freely admit that the size provides some great advantages. When editing text, a lot of screen real estate is nice -- especially when the iPad Pro is in portrait mode. The difference between using a note taking app like Notability or working with a PDF in an app like iAnnotate is the difference between using a steno pad and a legal pad.

If you travel economy, the screen size can be an issue -- as I will discuss in more detail in a later post. Once it is set up with the keyboard, it gets crowded. It is possible to use the onscreen keyboard lying flat on the fold-down table, but the angle is less than ideal.

And yet, as I type this and look at the screen, it still feels too big. I don't like that I am carrying around my old notebook backpack again (although its larger capacity is more than a little useful -- especially when traveling). I had thought that I would have adjusted to both the size and the bag more by now. Since I haven't, you should know that I appear to remain biased against the size.

The Apple Pencil

The Apple Pencil is a game changing device. I was not prepared for how much of a difference it makes even when compared to some very good styluses.

From the moment I figured out how to get a paper onto my iPad for marking, I have been using a stylus and I have found two that are -- at least for me -- worth using.

One is Adonit's Jot Pro. The illusion that the clear Precision Disk creates means that I feel like I can be much more precise in my lines. The first generation does have a problem, in that the nibs do eventually wear out and, once that happens, it quickly becomes unusable. The company does offer replacement nibs for purchase and the second generation of the stylus, it is said, has improved reliability (I haven't used it, so I can't comment on whether this is true or not.). Still, it is well worth using and I still have a Jot Mini in my bag.

The other is Applydea's Maglus stylus. I initially backed this as a Fund it project (an Irish crowdfunding site) and have not regretted it. Before the Apple Pencil came in, I replaced my original, which had developed a small tear in the nib after many years of use, with a next-generation model. The replaceable nibs are a nice feature and I am really impressed with how much of an improvement the microfiber nib makes, in terms of the feel. I would like to tell you about the graphite nib, but it was on back order. Whatever happens with this test, I expect that the graphite tip may mean the Maglus stays in my bag even if I have access to an Apple Pencil. As I am not an artist, I am not quite sure what to do with the brush nib -- although it is cool to look at and fun to play with.

Even though I still recommend these two styluses, I am amazed by the feel of the Apple Pencil. When I now electronically sign a PDF, I no longer zoom in on the signature line -- as I do even with the high-quality Adonit Jot and Maglus. I can just sign like I would if I were working with paper. The same is true with note taking using Notability. I used to use the magnified area at the bottom of the screen to write my notes (Incidentally, cursive is much easier to write than printing on a glass screen. Given the choice, I tend to print when taking notes on paper. On the screen, I almost exclusively write in cursive for the speed and for the fewer clicks it produces as the stylus hits the screen.). For an art app like Paper by 53, it is stunning.

I mention this because I did not expect to be overwhelmed by the Pencil. It, more than any other single thing, is what has made me think I should conduct this test.

The Apple Keyboard

I am troubled by the keyboard.

I am not troubled by its design or its keys. I know that internationally beloved technology columnist Andy Ihnatko does not like the feel of the keys. Like iMore's Rene Ritchie, I find them very usable.

Indeed, I think they may be too usable.

I cannot shake the feeling that all external keyboards are legacy input devices. It is something that watching those who use their mobile phones to text and search brings home. And while the individual user's choice is, and should be, a matter of personal preference, it is something that educators need to consider. What should we use, model, and expect students to use as they are positioned for a future that includes not only next year but ten years from now? I cannot shake the feeling that in ten years, when my daughter heads off to university, she will not be using an external keyboard. And while it is true that I want her to have every advantage possible, it is equally true that my current students will eventually be her competition in the workplace. And if they are wedded to legacy devices, they will be at a clear disadvantage.


Dr. Matthew M. DeForrest is a Professor of English and the Mott University Professor at Johnson C. Smith University. The observations and opinions he expresses here are his own. You are very welcome to follow him on Twitter and can find his academic profile at Academia.edu.

The Experiment

I have been at Guangdong Baiyun University for four days, giving talks and doing some set-up work for the Guangdong Baiyun University Center on American Culture and Race -- which, with the support of the US Embassy in Beijing, Johnson C. Smith University is establishing here with our partner. My set-up included preparing three iMacs, ten iPad minis, and ten iPod Touches for use by those visiting the Center so that they can listen to the podcasts and view the vidcasts and other material we will be uploading to promote mutual understanding between the US and China.

During the time I have been here, I have been using an iPad Pro. That is why I am writing this blog.

Traveling here with the iPad Pro is part of a larger experiment. In brief, we wish to determine if it can replace a faculty member's desktop computer for a month (Dec. 15 to Jan. 15 -- a time frame that includes travel, a break, an advising period, and, of course, teaching) and to document the successes and the pain points associated with such an experiment.

Yes: desktop.

Quite frankly, much of what is written about whether or not an iPad is capable of replacing a laptop is reductive. The iPad has been able to replace a laptop for several iterations. Indeed, I ceased to use a laptop computer soon after the first iPad was released. Much like Serenity Caldwell did recently during her iPad Pro Experiment, I closed my laptop for a week and tried to see if I could successfully complete what I needed to do with just the iPad. I have written and edited full-length articles on iPads for years. (You have actually been able to write articles on an iPhone using only the Notes app for a long time. The screen size just makes it inconvenient. Inconvenient is not the same as impossible. After all, the word-processing typewriters that came out at the same time as the early PCs had screens that could show far less than today's phone screens can.)

There was, of course, a learning curve. I discovered, however, that the overwhelming majority of my tasks could be completed on an iPad and that half of those I could not complete on the iPad were due to artificial constraints imposed elsewhere (IT staffs have since come around to supporting mobile-centric computing. Indeed, many have embraced it with a fervor that equals or exceeds my own.).

I also learned that many of the perceived constraints of the first generation iPad had less to do with the device and more to do with me. Typing on a glass screen was initially alien. After a week, however, it was normal and physical keyboards felt strange. Yes, the virtual and physical keyboards were different but my initial hesitations had to do with adjusting to what was new rather than what was better or normal.

Disclaimer: While I am not a slow typist, I do not move at the speeds of many professional writers. I never took a typing class, so I never learned to touch-type. For some, the changeover to a glass screen from a good keyboard involves a noticeable degradation in typing speed. For those whose livelihood depends on the number of words on the page or screen, it wouldn't be a good idea to threaten that livelihood with retraining. For future generations, however, it is worth remembering the angst that accompanied the shift from Typing 101 to Keyboarding 101 in the 80s and the questions about the speed and appropriateness of changing what is taught to students. Because, after all, there were going to be typewriters in offices for a long time and not everyone would have access to computers.

Whether an iPad can replace a laptop is a question based on a series of false assumptions and comparisons, as was humorously demonstrated by Fraser Speirs' review asking if the MacBook Pro can replace your iPad. Most comparisons are not as fair as this one, as they tend to compare a single product with three primary expressions (the iPad mini, iPad Air, and iPad Pro) to a class of products (the MacBook, MacBook Air, and MacBook Pro, to use Apple's line of laptops as a point of comparison). There is a huge range of capabilities among these three laptops and, for some users, the iPad mini, the MacBook, or a Chromebook are equally unusable because they need the features available at the Pro level.

With iOS 9, the iPad Pro can, in most cases, easily replace a laptop -- as Speirs has outlined on his blog. The obvious next question is whether the iPad can replace a desktop. Of course, given that laptops can replace desktops, Speirs' experiment could be said to have already been completed. But technology usage, like all politics, is local. Can his experiment be completed here?

And, as with most things, the only way to discover is to do.


Dr. Matthew M. DeForrest is a Professor of English and the Mott University Professor at Johnson C. Smith University. The observations and opinions he expresses here are his own. You are very welcome to follow him on Twitter and can find his academic profile at Academia.edu.