Quick Thoughts: Broken Links in Beowulf

In the first of his three Signum Symposia on J. R. R. Tolkien’s relationship to Beowulf, Professor Tom Shippey discussed the lists of people who are referenced in passing at various points in the epic. In his discussion, he comes down on the side of those who argue that many of these references are allusions to now-missing stories. Unferth, for example, remains honored at Herot even though there is a reference to his having been involved in the death of his brothers — an incident that should have marked him indelibly with dishonor.[1] We don’t get the story, but the text seems to expect us to already know it.

While I was listening to the Symposium again on the way to work the other day, two metaphors for this loss came to mind. The first has a direct application to this blog: These stories are broken hyperlinks. As we drift towards next-generation texts, allusions will increasingly appear in this technological form — links to click or words that, when tapped, will produce a box summarizing the connection.
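To make the hyperlink metaphor concrete, here is a minimal, purely illustrative sketch in Python. It models a handful of allusions as links: some still resolve to a summary box, while others point at stories that no longer survive and so behave like broken links. The entries and one-line summaries are my own shorthand, not a scholarly catalogue.

# A purely illustrative sketch: allusions in a next-generation text treated as
# hyperlinks. Entries whose target story survives yield a summary "box";
# entries whose story is lost behave like broken links. The names and
# summaries below are shorthand examples, not a scholarly apparatus.

ALLUSIONS = {
    "Sigemund": "The dragon-slayer whose deeds the scop recites after Grendel's defeat.",
    "Heremod": "A bad king held up as a warning and counter-example.",
    "Unferth's brothers": None,  # the story the poet assumes we know has not survived
}

def tap(word: str) -> str:
    """Return the summary box for an allusion, or report the link as broken."""
    if word not in ALLUSIONS:
        return f"'{word}' is not marked as an allusion."
    summary = ALLUSIONS[word]
    if summary is None:
        return f"404: the story behind '{word}' has not come down to us."
    return f"{word}: {summary}"

for name in ALLUSIONS:
    print(tap(name))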

To understand this change, however, we have to stop thinking about high literature as we think of it today. Yes, the Modernists alluded to other works all the time, as anyone who has looked at T. S. Eliot’s The Waste Land can tell you. But even though Eliot wants you to remember the Grail stories in general and Jessie Weston’s From Ritual to Romance in particular, this act of allusion is different from the kind of nod that the Beowulf poet, Chrétien de Troyes, and Tolkien engage in. Their allusions are less scholarly exercises and more the calling up of the kind of fan knowledge possessed by those who can tell you about the history of their favorite superhero or the long history of ships named Enterprise. It is the difference between connecting single stories to other ones and seeing the whole of a Matter, in the way we talk about Arthurian legend being the Matter of Britain and the tales of Charlemagne and his paladins being the Matter of France.

Reading Beowulf can thus be imagined as reading a partial comic book run.

This difference might help us with our students, who are more likely to possess the latter kind of knowledge about something (e.g., their favorite TV show or sports team) than the former. We might also benefit from spending some time considering whether the allusions within high literature, as it is imagined by the inheritors of the Modernist enterprise, aren’t just a dressed-up form of what scholars sometimes dismissively call trivia.


1. I would mention to those not as familiar with Beowulf that kinslaying is at the center of the story. Grendel, for example, is a descendant of Cain. The Finnsburg episode vibrates with the issue. Beowulf himself silently refuses to walk down the road that might lead to such a possibility when he supports Hygelac’s son, Heardred, for the Geatish throne rather than challenge him.


Dr. Matthew M. DeForrest is a Professor of English and the Mott University Professor at Johnson C. Smith University. The observations and opinions he expresses here are his own. You are very welcome to follow him on Twitter and can find his academic profile at Academia.edu.

 

On the Need for a New Rhetoric: Part V — The Beginnings of a New Rhetoric

To recap, for those who are joining us now but are not quite ready to review four blog posts of varying length: We are confronted with a sea change in writing, whether you look at it from the point of view of a practitioner or a scholar. Our means of production have changed enough to reshape both composition and distribution. The old system, which involved separate, paper-based spaces for research, drafting, and production, has been replaced by digital spaces that allow all of these to take place within a single, evolving file. To use an old term, we all now possess an infinitely cleanable palimpsest, one that can incorporate audio-visual material alongside the written word and that can be instantly shared with others — including those with whom we might choose to collaborate.

This change has not only changed the way we write; it necessitates a change in the way we teach writing and approach the idea of composition.

Having raised the issue, I am obligated to provide some thoughts on the way forward. Before doing so, I wish to stress something: Although I have, like most English professors, taught composition and rhetoric courses, I am not a Rhet-Comp specialist. There are others who have studied this field much more closely than my dilettantish engagement has required. I suspect that the better answers about the merging of aural, oral, and visual rhetorics will come from one of them. That said, this path forward cannot begin without us addressing the tools of the trade.

We must begin to teach the tools alongside the process of writing. 

One of the first steps for any apprentice is to learn their tools — how to care for them and how to use them. Masters pass on the obvious lessons as well as the tricks of the trade, with each lesson pitched to the level of the student and focused on the task at hand. Those who teach writing must begin to incorporate a similar process into writing instruction. Indeed, the process described in Part II of this series is a tool set that was explicitly taught to students at one point in the past.

As much as I would like to say that this should be done within K-12, so that university professors like me could abdicate any responsibility for it, the reality is that this kind of instruction must take place at all levels and by faculty in a variety of disciplines. This breadth is demanded by the reality of the tasks at hand. A third grade English teacher will focus on a different skill set, writing style, and content than a university-level Chemistry instructor will. They will be engaging in different kinds of writing tasks and expecting different products. Each, therefore, must be ready to teach their students how to create the final product they expect, and there is no magic moment when a student will have learned how to embed spreadsheets and graphs within a document.

This is no small demand to place on our educational system — especially upon composition faculty. Keeping up with technology is not easy, and the vast majority of those teaching writing are already stretched dangerously thin by the demands of those attempting to maximize the number of students in each class to balance resources in challenging financial times. Nevertheless, the situation demands that this become a new part of business as usual for us.

We need to adapt to the tools that are here rather than attempt to force prior mental frameworks onto the tools.

Those of us who were raised in the prior system might have students try to adopt a “clean room” approach to research — keeping separate files for research notes and for the final document, for example — in order to replicate the notebook-typescript divide described before. There is a certain utility to this, of course, and there is nothing wrong with presenting it as an option to students as a low-cost, immediately accessible solution to the problems inherent in the Agricultural Model. And this system will work well for some — especially for adult learners who were taught the Architectural Model. To do so to the exclusion of all other approaches, however, is to ignore the new tools that are available and to overlook the fact that students have their own workflows and ingrained habits they may not be interested in breaking. The options provided by Scrivener and Evernote, for example, may better provide for students’ needs. And while there is some cost associated with purchasing these tools and services, we should not let ourselves forget that notecards, highlighters, and the rest of the Architectural Model’s apparatus were not free either.

We must be more aware of what the tools before us are for and apply that knowledge accordingly.

If all you have is a hammer, the saying goes, everything looks like a nail. The same metaphor applies to word processing. 

If you are word processing, the assumption is that you are using Word. For the vast majority of people, however, using a desktop version of Word is overkill. Most users do not need the majority of the tools within Word. This does not make Word a bad choice for an institution nor does it make Microsoft an inherently evil, imperialist company. Microsoft developed Word to solve a set of problems and address a set of use cases. 

This observation matters for conceptual reasons. Many institutions focus on teaching students how to use the basic functions of Word because it is a standard. Because the accounting and finance areas want and need to use Excel, it makes sense for the majority of companies to purchase licenses for the Microsoft Office suite. As a result, most people working within a corporate environment — regardless of operating system platform — will find Word on their computing device for word processing.

If all these users are likely to do, however, is change a font or typeface, apply a style, and control some basic layout (e.g., add columns and page or column breaks), there is no need for an instructor to focus on teaching Word. They can focus instead on the task and the concerns that the faculty member is addressing (e.g., the appropriate format for the title of a book).

Yes, it will be easier to standardize on a device platform for instruction — especially since, as Fraser Speirs and others have pointed out, faculty need a common set of expectations for what can be expected of students and often end up serving as front-line technical support.

That said, institutions should consider their needs carefully when it comes to purchasing decisions. For the vast majority of students at most educational levels, there is no difference between what they will do in Apple’s Pages, Google’s Docs, Microsoft’s Word, or any of the open source or Markdown-based options, like Ulysses. And the choice should be made based on the utility provided rather than a perceived industry standard. For long-form publishing, Word may be the best answer. If students are going to do layout incorporating images, Pages will be the stronger choice.

For some, these three points will feel sufficiently obvious as to make them wonder what we have been doing all these years. The simple enough answer is that we have been doing the best we can with the limited time we have. These recommendations are, after all, additions to an already over-full schedule. They are also changes in orientation. A focus on the tools of writing, rather than on the writing process, will be a change. For the reasons outlined in this series, however, I would argue that they are critical ones.


Dr. Matthew M. DeForrest is a Professor of English and the Mott University Professor at Johnson C. Smith University. The observations and opinions he expresses here are his own. You are very welcome to follow him on Twitter and can find his academic profile at Academia.edu.

 

On the Need for a New Rhetoric: Part IV — The Changing Text

In my last post, I considered the change from an architectural model of composition to an agricultural model of composition. If we were facing just this change, it would be sufficient reason to change our approach to writing. This shift, however, is not the only transformation occurring. The capabilities of the texts we are creating are changing as well. 

Back when research was done on 3” x 5” notecards, the medium of final production was paper — whether the final product was hand-written using a pencil or pen, typed, or printed using a black ink dot-matrix printer. Now, digital-first documents are printed — if they are printed — on color laser or ink-jet printers.

The key word in that sentence is if.

Whether it is the “papers” uploaded to class portals or the emails that have replaced interoffice memos, much of what now comes across our desktops is digital-only documents. A growing number of these include more than just text. They include images, video and audio, and hyperlinks that extend the text beyond the borders of the file. These next-generation texts offer those composing them the ability to embed source material rather than summarize it. A discussion of how multiple meanings in Hamlet lead to multiple interpretations on the stage, for example, could include clips from different performances in order to demonstrate a point.

This is not the time to go over all of the implications of multimodal, next-generation texts.[1] It is enough for us to recognize that digital-only documents exist and require us to take them on board as we develop a new rhetoric — one that must account for visual and auditory rhetoric and layout in addition to the written word.


1. The ability to hyperlink to sources and, at times, specific places within a source should force a reconsideration of citational methodology, for example. Current style guides assume a paper world rather than a digital one. 


Dr. Matthew M. DeForrest is a Professor of English and the Mott University Professor at Johnson C. Smith University. The observations and opinions he expresses here are his own. You are very welcome to follow him on Twitter and can find his academic profile at Academia.edu.

 

On the Need for a New Rhetoric: Part III — The Agricultural Model of Writing

In my last entry on this topic, I took us back to yesteryear and described to those younger than the 5 1/4” floppy disk how research was once taught. Whether students did all of these things or not, the overarching system for organizing research was propagated and taken up into the imagination of students as they left high school and went off to college and university and then on to grad school before returning to the classroom to teach.

Slowly, however, the tools changed. First, photocopying became available to anyone with enough quarters, and differently colored highlighters were grafted onto the architectural method. Computers became available and the internet provided more sources — sources that could be printed out — requiring still more highlighters. As with the printing press, accurate reproduction of information became trivially easy, and index cards were replaced by three-ring binders with, for the more obsessively organized, dividers. Others gathered their printouts into loose piles of paper that joined the books stacked near the writer’s computer station as they worked.

Then computers became portable.

This slow change may seem like a small thing in this progression but it is, I would argue, a critical one. When a computer can be carried to the place of research, there is no need for a photocopy or printout. All that is required is to take the information and type[1] it into a file that is saved to memory — whether that memory is an 8”, 5 1/4”, or 3 1/2” disk; a spinning platter hard drive; a USB thumb drive; a solid-state hard drive; or a cloud storage service, like Dropbox or iCloud.

As anyone who has taken notes in a word processor can tell you, there is a huge temptation to begin evolving the notes into a draft rather than creating a new draft document. Indeed, it is logical to do so. All of the research is there, ready to be re-ordered through the magic of cut-and-paste and then written about while the referenced material — whether quotations or notes — is onscreen awaiting response. This approach keeps the material fresh in the mind of the writer while enabling them to take advantage of the benefits of digital composition.

I suspect that will sound familiar to many reading this. I also suspect it is the method most of us now use when composing — whether we were trained in an architectural model of research or not. 

This approach to composition can be seen as an agricultural model of production — one where ideas and information are seeded into a document and then organically grown as the work-in-progress develops throughout the research and writing process.

For all of its advantages, and those advantages[2] are significant, there are major limitations to this approach. Skipping the step of transferring information from one document (say, a notecard) to another promotes accidental plagiarism by increasing the chance that a note will inadvertently become separated from its source. It also makes it less clear to the writer who crafted a particular turn of phrase as notes are transformed into the draft. In addition to the problem of plagiarism, growing a paper (rather than building it) trades the organizational system that is created when a writer has to formulate multiple outlines to order their research and writing for the less rigorous world of headings scattered through a draft. It also skips the step where the organization of a writer’s ideas is initially tested before the first word of the first draft is written.

These limitations are less a function of the tools at our disposal than they are the absence of a method that embraces these tools. To push the metaphor, we are at the hunter-gatherer stage of the agricultural model, where means of storage have been developed but we have not fully developed a system for cultivation.

Our means of production has changed. Our pedagogy has not.

This is the core reason that we need to create a new rhetoric — one that accounts for the new method of textual creation that digital composition allows and that embraces the ability to incorporate media into what was once a static document.


1. Typing, of course, is no longer the only way notes are taken. As anyone who has seen lines forming at the whiteboard at the end of a lecture or meeting or watched people hold their cell phones and tablets up to get a quick shot of a presentation slide can tell you, photography has become as important a means of information capture as typing.

2. To name a few of the advantages: portability of the research once it has been done, the ubiquity of high quality electronic resources, and a superior means of production. No matter what the hipsters and typewriter aficionados tell you, word processing is superior to typing for the vast majority of users most of the time. This does not mean they are wrong about what works for them — there are times when I feel compelled to write out ideas or to-do lists using a fountain pen. But I am under no illusion that keeping that to-do list on paper beats putting it into Reminders, Todoist, or OmniFocus. I just wish I could decide which of these digital tools works best for me.


Dr. Matthew M. DeForrest is a Professor of English and the Mott University Professor at Johnson C. Smith University. The observations and opinions he expresses here are his own. You are very welcome to follow him on Twitter and can find his academic profile at Academia.edu.

The Algorithmic Vulnerability of Google and Facebook

New York Times reporter Rachel Abrams wrote this week about her recent attempts to convince Google that she is, in fact, still alive. It is a damning article for a company that has, at its core, an information retrieval mechanism driving its advertising revenue stream. If, after all, users cannot trust the information a Google search provides, they will begin to go elsewhere, and the user databases those searches generate will degrade and lose value for advertisers looking to target an audience.

Abrams’ article should not only be a wake-up call for Google. It exposes a key vulnerability for Facebook and other algorithm-based companies.[1]

While computing power has increased and artificial intelligence has improved dramatically, we have not outstripped the need for human curation.

John Adams’ assertion to the jury weighing the guilt of British soldiers involved in the Boston Massacre that “Facts are stubborn things” is no less true today. And despite the protests of those who don’t like to have their own biases and world views challenged, there is a difference between reputable and disreputable sources of information. When a company takes upon itself the role of an information aggregator, as Google has, or stakes out a position as a new public square, as Facebook has, it has an ethical and moral obligation to do so in good faith — even in the absence of a legal requirement to do so. Yes, reasonable people can interpret facts differently. Unreasonable people — and opportunists — embrace the factually wrong.

More importantly, however, self-interest should drive them to act in good faith. Stories like Abrams’ highlight a credibility gap — one that competitors will exploit. Google was once an upstart that succeeded because it outperformed AltaVista and eliminated the need for search aggregators like Dogpile. Google, too, can be supplanted if its core offering comes to be seen as second best because its search results can no longer be trusted.

Abrams’ story points to a need for Google and others to rethink their curation strategies and base them on something other than short-term Return on Investment. There are indications that this is beginning to happen, but Google’s tendency to rely on temporary workers is, ultimately, a losing strategy — one that doubles down on the primacy of the algorithm rather than accepting the need for humans trained in information literacy and able to discern between the correct and the incorrect. These curators must have the authority and ability to make corrections to algorithmically generated databases before those databases become useless to users — whether those users are looking for a holiday recipe or looking to sell ingredients to cooks.
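To illustrate the kind of curation layer I have in mind, here is a minimal, hypothetical sketch in Python. It is emphatically not a description of how Google or Facebook actually work; it simply shows corrections recorded by trained human curators taking precedence over whatever the algorithm has inferred. Every name and value in it is invented for the example.

# A hypothetical sketch of human-in-the-loop curation: a table of corrections
# entered by trained curators is consulted before falling back to the
# algorithm's inferences. An illustration of the idea, not a description of
# any real company's system.

from typing import Optional

# What the algorithm has inferred (possibly wrong, as in Abrams' case).
algorithmic_facts = {
    "Rachel Abrams": {"status": "deceased"},  # the algorithm's error
}

# Corrections entered by human curators.
curator_overrides = {
    "Rachel Abrams": {"status": "alive", "reviewed_by": "curation team"},
}

def lookup(entity: str) -> Optional[dict]:
    """Return the curated record if one exists, falling back to the algorithm."""
    if entity in curator_overrides:
        return curator_overrides[entity]
    return algorithmic_facts.get(entity)

print(lookup("Rachel Abrams"))  # {'status': 'alive', 'reviewed_by': 'curation team'}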


1. That the two obvious companies to comment on here are focused on advertising may hold a hint to an underlying issue — the unspoken argument’s warrant. The focus on generating information for advertisers has distracted these companies from the need to provide quality information for their users. The need to generate revenue is clear and understandable. They are not running charities, and protesting their profit motive sounds as strange to my ear as those who were dismayed that Academia.edu might be trying to make money. The calculation with all such services must be the value proposition: Is the user being provided with a service worth the cost? And the service provider must make sure it does not lose sight of its users as it focuses on its profit source. The moment services like Google and Facebook become more about advertisers than end users, they open themselves up to competitors with better mousetraps — ones that will provide more value to the advertisers.


Dr. Matthew M. DeForrest is a Professor of English and the Mott University Professor at Johnson C. Smith University. The observations and opinions he expresses here are his own. You are very welcome to follow him on Twitter and can find his academic profile at Academia.edu.

 

On the Need for a New Rhetoric: Part II — The Architectural Model of Writing

In my last post, I offered an assertion without exposition: That writing on a computer/mobile device screen has significantly changed the model that we use for creating arguments and composing the form they take because writers have moved from an architectural model of production to an agricultural form of production. In this post, I will explain what I mean by an architectural model of composition. 

Readers of a certain age will remember research in a time before the ubiquity of the internet. In such days of yore, the well-equipped researcher went to a library armed with pencils and pens of varying color, at least one notebook, and a stack of 3” x 5” cards gathered together into multiple stacks held together by rubber bands.[1]

For those of you too young to have ever seen such a thing, or too old to remember the system’s details[2], here is how all of these pieces worked together.

To keep things organized, you started with a research outline — one that roughly laid out what you were looking for. This was as much a plan of action as it was an organizational system. It had a hypothesis rather than a thesis — the idea or argument you were testing in your research.

Once in the library, you went to a card catalog — a series of cabinets holding small drawers that contained cards recording bibliographic information. One set of cabinets was alphabetized by author. Another set of cabinets held similar cards but they were organized by subject. Each card also recorded the Library of Congress or Dewey Decimal number that corresponded to the shelf location of the book in question.[3]

If you were looking for more current material, you consulted a Periodical Index of Literature, which was published annually and contained entries for articles published in magazines. With that information, you could request from the reference librarian a copy of the bound volume of the periodical or the microfilm or microfiche to place into a reader.

For each source you referenced, you carefully recorded the full bibliographic information onto one note card and added it to your growing stack of bibliographic cards — which, of course, you kept in alphabetical order by author. Each card was numbered sequentially as you examined the source. 

These were the days before cell phone cameras and inexpensive photocopiers. You took handwritten notes in a notebook and/or on index cards. For each note you took, you noted the number of the source’s bibliographic note card in one corner[4] and its place in your organizational outline in another corner. To keep things as neat as possible, each card contained a single quotation or single idea. Following the quotation or note, you listed the page number. Finally, you would write a summary of the note along the top of the card to make it easier to quickly find the information when flipping through your cards.

You did this for every note and every quotation.

At the end of the day of research, you bundled up your bibliography cards in one stack and your notes in a second stack — usually in research outline order though some preferred source order.

When your research was complete, you created your thesis, which was a revision of your hypothesis based on what you had learned in your research. You then created an outline for your paper.[5] Once the outline was ready, you went back through your notecards and recorded the paper outline location in a third corner of the card — usually the upper right hand corner. (For those looking to handle revisions to the structure or to make certain pieces of information stand out, a separate color could be used.) You then stacked the cards in the order of your outline and proceeded to write. As you came to each point you wished to make, you hand wrote (You would not have typed a first draft.) the information or quotation, noting the source where and when appropriate.

Then you revised and edited until you were ready to type the paper. If you were among the fortunate, you had a typewriter with a correction ribbon or had access to correction strips. If not, you got used to waiting for White Out to dry, lest you be forced to retype the entire page.

From this description, I hope you can see why I refer to this system as an architectural model. You gather raw material, shape the raw material into usable units of standardized sizes, then assemble them according to a kind of blueprint.

I suspect you can also see the sources of many of our current digital methods. To put it in the language of contemporary computing, you created an analog database of information that you had tagged with your own metadata by searching through sources that were tagged and sorted by generic metadata. The main difference is that the database of information was stored on 3” x 5” cards rather than within spreadsheet cells, for example.
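To push the computing analogy one step further, here is a sketch, in Python, of what one of those cards looks like once it is treated as a record in that database. The field names are mine and purely illustrative; they simply mirror the corners and headings described in the workflow above.

# A sketch of the "analog database" described above, translated into code.
# The field names are invented for illustration; they mirror the corners of
# the physical cards rather than prescribe any particular tool.

from dataclasses import dataclass
from typing import List, Optional

@dataclass
class BibliographyCard:
    number: int        # sequential number assigned as the source was examined
    citation: str      # the full bibliographic information

@dataclass
class NoteCard:
    source_number: int            # corner 1: points back to a bibliography card
    research_outline: str         # corner 2: place in the research outline
    paper_outline: Optional[str]  # corner 3: added once the paper outline exists
    summary: str                  # the heading written across the top of the card
    body: str                     # the single quotation or idea
    page: Optional[int]           # the page number following the quotation or note

def order_for_drafting(cards: List[NoteCard]) -> List[NoteCard]:
    """Stack the cards in paper-outline order, the last step before writing."""
    return sorted(cards, key=lambda c: (c.paper_outline or "", c.source_number))

# Placeholder example: one source and one note keyed back to it.
source = BibliographyCard(number=1, citation="Toulmin, Stephen. The Uses of Argument. 1958.")
note = NoteCard(source_number=1, research_outline="II.A", paper_outline="III.B",
                summary="(summary of the note goes here)",
                body="(quotation or paraphrase goes here)", page=None)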

So long as computers were fixed items — desktops located in labs or, for the well-to-do, on desks in offices or dorm rooms — this model persisted. With the coming of the portable computer, however, a change began to occur, and writers shifted from this architectural model to an agricultural one without changing many of the underlying assumptions about how research and writing worked.


 

  1. Those exceptionally well prepared carried their index cards in small boxes that contained dividers with alphabetical tabs.
  2. I hasten to note that this is what we were taught to do. Not everyone did this, of course.
  3. What happened next depended on whether you were in a library with open stacks or closed stacks. In open stack libraries, you were able to go and get the book on your own. In closed stack libraries, you filled out a request slip, noting your table location, and then waited while the librarian retrieved the work in question. The closed stack model is, of course, still the norm in libraries’ Special Collections sections.
  4. Some preferred to include the name of the author and title of the work. This could, however, become cramped, bumping into the heading of the note, if you placed it in one of the upper corners. For this reason, most people suggested placing this information on one of the lower corners of the card. I seem to recall using the lower right corner when I did this and placing the note’s location within the organizational outline in the lower left corner.
  5. Some continued to use notecards in this step. Each outline section was written on a card, which allowed them to be shuffled and moved around before they were cast in stone. 

Dr. Matthew M. DeForrest is a Professor of English and the Mott University Professor at Johnson C. Smith University. The observations and opinions he expresses here are his own. You are very welcome to follow him on Twitter and can find his academic profile at Academia.edu.

On the Need for a New Rhetoric: Part I — Setting the Stage

In 1958, Stephen Toulmin published The Uses of Argument, which espouses a method of argumentation that is one of the current foundations for teaching rhetoric and composition at many universities. In brief, it is a system for building and refuting arguments that takes into account the claim, the evidence required to support it, and the assumptions that claim is based upon. Its versatility and effectiveness make it a natural choice for teaching first year composition students how to make the jump from high school to university-level writing.

As good as this system is, it does not address the largest change in writing in English since 1476 — the year William Caxton introduced England to his printing press and publishing house. Although it is almost impossible for us to conceive in 2017, his editions of Geoffrey Chaucer’s Canterbury Tales and Sir Thomas Malory’s Le Morte d’Arthur were fast-paced, next generation texts that relied on an advanced and inherently democratizing technology. No longer was one limited to the speed of a scribe. Leaves could be printed off in multiple copies rather than laboriously copied by hand.

Over the following five or so centuries, this change in the means of production has had an undeniable impact on the act of composition. Dialect[1], punctuation[2], and spelling[3] standardized. Writers began to write for the eye as well as — and then instead of — the ear as reading moved from a shared experience to a private one.

The world that the printed text created was the world within which Toulmin formulated his method of argumentation. While it used the work of Aristotle and Cicero, it no longer situated itself within an oral and aural world. A world where a reader can stop and reread an argument comes with a different set of requirements than one where an orator must employ repetition to make certain that an audience grasps the point. By way of example, the change in tone associated with Mark Antony’s pronouncement that “Brutus is an honorable man” is more powerful when heard from the stage (or screen) than when seen on the page (or screen) of Act 3, Scene 2 of Julius Caesar.[4]

There have been several innovations and improvements along the way. But the addition of pictures for readers and typewriters for writers did not inherently change things. Yes, the material produced by a writer would look like its final form sooner in the process, but the process did not change. The limits of experimentation were bound by the limits of the codex[5] format.

With the development of the computer and modern word-processing, however, we are experiencing a change every bit as significant as the one wrought by Caxton — one that will once more change the way we compose. Indeed, it already has changed the way we compose even though we do not all recognize it. What is currently lagging is a new methodology for composition. 

Before I tell you what to expect in the next post, let me tell you what not to expect. I will not be discussing the shortening of attention spans or the evils that screen time has wrought upon our eyes and minds. In truth, many of those arguments have been made before. Novels, in particular, were seen as social ills that harbored the potential for weakening understanding. In fact, the tradition of railing against new methods dates back to the invention of writing. When Thoth came to the Egyptian gods to tell them of his revolutionary idea — an idea that would free humans to transmit their thoughts from one to another in space and in time — the other gods objected, noting that writing would come to destroy human memory, much in the way people today mourn their inability to remember phone numbers now that our smartphones remember them for us.

What I will be positing is a materialist argument: That writing on a computer/mobile device screen has significantly changed the model that we use for creating arguments and composing the form they take because writers have moved from an architectural model of production to an agricultural form of production.


  1. For those non-English majors reading, the Middle English spoken by Chaucer, a Londoner, was noticeably different from the Middle English of the English Midlands spoken by the Pearl Poet, the unnamed author of Pearl and Gawain and the Green Knight.
  2. The manuscript of Beowulf, for example, is one long string of unpunctuated text. It was assumed a highly educated reader would know where one word ended and the next began.
  3. William Shakespeare, famously, signed his name with different spellings at different times. This was not a sign of a poor education. It was a sign of a period before any dictionary recorded standard English spelling.
  4. It is worth noting that this example carries within it another example of the change being discussed. William Shakespeare did not write for a reading audience. He wrote to be heard, and changes in pronunciation have obscured jokes and doubled meanings.
  5. Given what will come later, it is worth beginning to use the technical term for the physical form that books take — leaves of paper within two protective covers.

Dr. Matthew M. DeForrest is a Professor of English and the Mott University Professor at Johnson C. Smith University. The observations and opinions he expresses here are his own. You are very welcome to follow him on Twitter and can find his academic profile at Academia.edu.

Instagrammar

For a good while now, I have been trying — and failing — to get the hang of Instagram. I hasten to add that my inability to wrap my head around Instagram applies equally to Snapchat and any other photo-based social network. The issue isn’t the interface or the filters or controls or anything else about the usability of the app.

I don’t speak the language of Instagram but I am learning to understand it.

I consider my inability to speak this language a problem. 

In one of the unrecorded discussions of our most recent Mellon funded Summer Institute, Fraser Speirs pointed out that if there was a theme to one of our day’s (or days’) discussions, it had to have been Instagram and how our students had begun to gravitate towards it as a communication platform and social space — one that we should understand and learn how to use to reach them and, when appropriate, to bring into our classrooms. 

I had looked at Instagram (and Snapchat) in the past and knew several people (including the always au fait Jemayne King — check out his “Meme You No Harm” talk) who had made the jump to photo-based platforms. So I dusted off my account[1] and tried once more to navigate the platform. 

I still don’t get it. That said, I have learned a lot from those who do. In my attempt to “get it”, I selected people to follow from among former students, some current colleagues, and some artists (mostly dancers, photographers — both professionals and serious hobbyists[2] — and sculptors) of my acquaintance. I also added some organizations I respected highly, including NASA.

I cannot say that there is a one-to-one correspondence between artists and high-quality Instagram accounts but the odds will ever be in your favor. Their eye has been trained to follow the phenomenal world and either capture compelling images or know when one has been captured.

What I found particularly interesting was what NASA was doing with Instagram Stories. I had initially followed them for their collection of beautiful space imagery. They let me go, at least imaginatively, where no one has gone before. For those interested in the application of technology in the classroom, NASA’s stories are worth considering carefully.

For those unfamiliar with stories, they are a series of still images and videos that a user links together. In theory, they tell some sort of story — even if it is as simple as “Look at how my day went.” Those teaching narrative in Creative Writing and Photography classes should take note of this. It is a way of building narratives within a means of distribution.

NASA is using these stories to create self-contained mini-presentations on their missions or a space science topic. They are well put together but, as is appropriate to the medium, they are rough cut rather than highly polished artifacts involving advanced post-production.

They are a model for the rest of us. After all, these are people explaining rocket science to those who aren’t rocket scientists. It is the kind of thing that any academic could use to illustrate the self-contained, foundational items from their field — the elements we so often despair that our students don’t come to us having learned. An English professor, for example, might put together a series of stories that explain the parts of poetry. One story could focus on the iamb followed by another on the trochee. Or, perhaps, a series that explains the comma and how it is used.

The stories may be ephemeral, but there is no reason that they could not be re-released as reruns by a sufficiently enterprising individual.

Now, if someone could put together one covering the language of Instagram, I would appreciate it.

 


  1. Those of you who are curious should read the privacy policies and understand who owns what before jumping into these sites. Those who have developed an antipathy towards Facebook, for example, should understand that Facebook owns Instagram. For those who have bought in to the Facebook platform, however, this will come as great news because the two services integrate easily.
         I don’t mention this to be an alarmist but because users should remember that free services are paid for somehow and that others might try to unscrupulously profit from their work and/or family photos. Look at your privacy settings and make sure they are set at a level that you are comfortable with. Also, be sure to understand that the deal you have made about the pictures you upload could change as companies are bought and sold.
     
  2. I chose Andy Ihnatko and Scott Bourne’s accounts to use as examples here because not only are their accounts public, they regularly refer people to them on podcasts. There are many others whose names (e.g., https://www.instagram.com/jbphotography2016/) could easily appear here.

Dr. Matthew M. DeForrest is a Professor of English and the Mott University Professor at Johnson C. Smith University. The observations and opinions he expresses here are his own. You are very welcome to follow him on Twitter and can find his academic profile at Academia.edu.

An Always Engaged Audience

Some background: For those in other parts of the world who may not know, this fall has been unusually rough when it comes to the bugs that have been floating around here in North Carolina. Like most parents, my wife and I have a working arrangement — subject to change based on the needs of the day — as to who will stay home with our daughter when she is too sick to go to school. Since we are both professors, this pattern generally aligns with the Monday-Wednesday-Friday (MWF)/Tuesday-Thursday (TR) split.

This semester, I have the MWF shift. This is counterintuitive, as my classes this semester are on MWF. She, however, has classes with more in-class assignments on MWF than I do. As a result, my MWF schedule is more open to alternate approaches.

I have been using Periscope to stream classes when I stay home. It is an imperfect vehicle for what I am attempting[1], but it gets the job done. 

Strangely, I have found holding class via Periscope to be a surprisingly comfortable experience. And after class today, I think I have settled on why this is.

For those of you who have never taught, facing a room full of students can be a depressing task. I know that my students are more engaged than they look. Their questions and comments have, on more than one occasion, proven as much just moments before I was about to succumb to despair. But if you know the semi-blank look that people assume when they watch television, you know what you will see looking out at a room full of students. Not all of them look like this, of course. Some are more animated and some are less. Nevertheless, there is a passive look that pervades the room. This can be true even with the most engaged of students. If one is taking notes, for example, you do not get to see the animation in their face because they are looking down.[2]

When you broadcast on Periscope, you look at yourself. It is a feature that lets the broadcaster know what his audience is seeing. So, when I am talking about Mark Twain and H. G. Wells,[3] I am looking at someone who is actively engaged — not a classroom of students who are paying attention and trying to process what is being presented or discussed.[4] 

I know that when I present, I feed off of those who are actively engaged. Most people in front of an audience do. When that is happening, I feel like I am doing a better job (Whether I am or not is a different question.). With Periscope, I provide myself with a positive feedback loop.

As a result, classroom performance, in the literal sense of the nature of what is presented rather than its content, could (Let me stress: could.) improve on the in-room experience with access to this technology, if it can be successfully linked to a mechanism for student participation, as discussed in footnote one below. It might also be worth considering and weighing by those running experiments with classroom delivery, as can be seen at Minerva University or through on-demand services like Khan Academy.


[1] If you want to see what can be done with a streamed class that functions very much as an interactive seminar, I would highly recommend that you tune in to one of Signum University’s open classes. You will find me sitting in on “Exploring Lord of the Rings” most Tuesday evenings, beginning at roughly 9:30 PM Eastern. Professor Corey Olsen simultaneously broadcasts via Periscope/Twitter, Twitch, and Discord while being “present” in Lord of the Rings Online. The online version of Middle Earth allows for the classes to take field trips to locations of note every week. (The broadcasts are then made available on YouTube, as can be seen in this randomly chosen example.) He manages to juggle three chat areas (Discord is the primary location for the comments and questions.), where people ask questions and offer comments.

Since this blog is about the iPad in the educational space, I will let you know how I attend. I run  Twitch (which contains the video and audio stream I use) and Discord (where I am present in the chat) in split screen mode on my 10.5” iPad Pro. I find it quicker to type my comments on the Smart Keyboard but often use the onscreen keyboard. To get a fuller picture and sound, I AirPlay the Twitch stream to my Apple TV. That, however, is a creature comfort and allows me to avoid resizing the split screen view to see more of the slides presented on the screen and/or type. It is possible to have a decent experience doing it all on the iPad.

[2] In case you are a student and are wondering, your professors can tell when you are writing about their class rather than another. The rhythm of your engagement in the class and the engagement with the page are either aligned or are not.

[3] Today’s 9AM class, which is on the way people understand time and how that is expressed in art and society, wrapped up A Connecticut Yankee in King Arthur’s Court and started The Time Machine.

[4] For those who might want to offer a well-meaning critique here and talk about active learning, let me offer a quick distinction: The look on a student’s face is a function of what they are doing rather than the pedagogical structure they are existing within. The same look will pervade the faces of students during the kind of discussion or activity you might suggest I try. There is a material difference in the look worn by a student when they are “on” — when they have the floor or are talking — and when they are not, whether they are in a lecture or working in a small group. As a practical matter, it is impossible for everyone to be fully active at once. It is a question of how often they are fully active, how often they are partially active, how often they are passively active, and how often they are disengaged.


Dr. Matthew M. DeForrest is a Professor of English and the Mott University Professor at Johnson C. Smith University. The observations and opinions he expresses here are his own. You are very welcome to follow him on Twitter and can find his academic profile at Academia.edu.

140 Characters of Outrage or 140 Character Koans

Andy Ihnatko’s recent comment about the release of the iPhone X is a great example of the potential held within a tweet.

[Screenshot of Andy Ihnatko’s tweet]

There is little to disagree with here. Everyone should have their soul’s dignity respected and held sacrosanct, whether by others or by ourselves. But his statement has an edge to it — one that cut many, myself included. Based on the 126-and-counting responses and replies to the Tweet, I was not alone.

Yes, I felt lucky that I got an early date on my upgrade to the iPhone X (Being sick, I slept through the 3 AM Eastern launch time.). But I have had to wait for technology upgrades before and did so with a shrug. I considered responding to the tweet in order to make the distinction between the sense of feeling lucky to get an early date and the kind of toxic need to be a first-day adopter. 

Then I saw the outrage from those who took great umbrage at a statement that is safely true.

I won’t get into the details of the responses (You can read them for yourself, if you wish.) as I don’t feel comfortable speaking for others in regard to what they felt and why they felt it. Their outrage, however, triggered a need in me to confront my own response. While I don’t think my relationship with technology is toxic (I can stop any time. Really.), my need to justify myself to someone who does not know me from Adam certainly signals something — something more than an acknowledgement of the false intimacy that social media can sometimes breed. And while my wife’s occasional jokes about my relationship to my iDevices being a little too close to Gollum’s to the One Ring might be overstating it a bit, the cutting edge is still there.

Ultimately, I do think that the technological revolution offered by these devices is more democratizing and liberating than it is binding. I am able to stay in closer touch with people I know on multiple continents than I would have been at any time in the past. Yes, it is true that, in centuries past, there were great letter writers who had such correspondences — ones that were much more intimate than seeing a picture float past on a social media platform — but those people tended to have access to servants to handle things like getting dinner ready. Now, such connections are available to anyone who can afford the technology and the fees required to connect to the internet[1].

No, I am more concerned about our collective reactions to the koanic nature of Twitter and, to a lesser degree, other social media outlets. Koans are difficult and knotty things designed to make one reflect deeply on the nature of self and one’s relationships to the universe. While I don’t think most tweets rise to the level of koans, they do touch a similar nerve in us. It is our reaction to them that is telling.

Ihnatko’s statement is not a judgement of any individual, and he has repeatedly said that there is nothing wrong with wanting an iPhone X and taking pleasure in getting one. That did not stop me (and others) from trying to justify our feelings of good fortune at getting it sooner rather than later, nor did it stop others from insulting and demeaning him.

I would submit that this tells us a great deal more about those of us responding (Let’s face it: The fact that I am writing this means that I am responding at much more length than those on Twitter.) than it does about Ihnatko or the iPhone X. 

This soul-searching is probably more than a little necessary. The technological revolution we are experiencing will likely be looked back on 500 years from now as a Renaissance that dragged us out of a kind of Dark Age that began sometime around 1914[2], if not before. It is changing the world as significantly as the introduction of movable type to the West in the mid-1400s and it is doing so more quickly. We need, as individuals and as a society, to puzzle out how we will relate to these devices, given how intimate these relationships have become. 


[1] These costs may not be trivial, but they are less expensive than maintaining an aristocratic lifestyle.

[2] Another potential date would be 1789. In cases like this, however, it is best to leave it to the future. They will have a much better perspective on things.


Dr. Matthew M. DeForrest is a Professor of English and the Mott University Professor at Johnson C. Smith University. The observations and opinions he expresses here are his own. You are very welcome to follow him on Twitter and can find his academic profile at Academia.edu.

A Tip: Presenting with an iPad

About a year ago at a conference, someone looked at my iPad just before I got up to present a paper.[1] That what I was doing came as a surprise to them made me think I should pass this tip along to readers here.

This is a screen shot of the paper I recently submitted for the conference proceedings[2] of the Yeats Society of Korea's 2017 International Conference on W. B. Yeats and Movements in Literature, Art and Society in Seoul:

[Screenshot of the paper as submitted]

And here is what appeared on my screen while I presented.

[Screenshot of the same paper at the larger font size used while presenting]

It doesn't take a rocket scientist to increase the font size on an electronic device. What it does require is for us not to lose sight of what we can do with a digital-first document. It is easy, after all, to get attached to the thought that the timing of the presentation is tightly tied to the document's length as formatted.[3] But once the paper is completed (Well, as completed as any presentation draft gets....), we are free to change its appearance to suit our immediate needs.

The sharp-eyed among you will notice that these are two separate files. I use a duplicate of the completed draft because it invariably needs editing to account for the fact that you can do things in writing that you cannot in speaking -- and vice versa. Long sentences, for example, can be complex on the page or screen (see footnote one below) without risking losing a reader in a way that they cannot be when a speaker is addressing a listener.

Using a synced electronic copy also means I have a backup. If something goes wrong with my iPad, I can pull out my phone and access the file. It may not be as convenient, but it sure beats having to try to recite your paper from memory.

Incidentally, this approach also works with paper printouts and on laptop screens. These methods have some drawbacks, of course. Printing in a larger font means more pages and an increased chance of the pages getting shuffled (I always make sure to have the page number formatted as "X of Y pages".). Laptops are more awkward while standing at a podium, and it is not as easy to scroll through a document while presenting as it is on an iPad. These are, however, things that can easily be worked around if you haven't jumped on the iPad bandwagon.

---------------

[1] For those outside of the humanities who may be more used to other methods of presenting (e.g., poster sessions), I generally give a 20 minute presentation when I attend a conference as part of a 90 minute long panel. Any time that remains after the three presentations (additional time is eaten up by introductions, people getting up and sitting down as one speaker makes way for the next, people running over their allotted time, and the like) is a Q & A and discussion period. Immediately following this, everyone rushes for the bathrooms and/or the coffee station.

[Photo of the printed conference proceedings, with a pen added for scale]

[2] One of the nice things about the Yeats Society of Korea's conferences is that the proceedings (pictured above, with a pen added for scale) come out before the conference and are distributed to the attendees for their use during the event. As such, we all have the papers in front of us and can make notes in them as we listen. This past year, we also received it electronically and I was able to use Goodnotes to annotate the document.

[3] For those who have not done this regularly, a twenty minute paper is roughly eight double spaced, 11-12 point pages long. 


Dr. Matthew M. DeForrest is a Professor of English and the Mott University Professor at Johnson C. Smith University. The observations and opinions he expresses here are his own. You are very welcome to follow him on Twitter and can find his academic profile at Academia.edu.

The Tyrannies We Falsely Blame on Technology

As this occasional blog may indicate, I am far more technophile than technophobe. Indeed, I prefer to embrace titles like “technophile” and “early-adopter”(^1) to stave off alternate images that would align my behavior with addiction — a case of Star Trek’s holo-addiction writ small — or with the wish fulfillment and delusions of grandeur of imagining myself a superhero like Batman, with his voice-controlled Batmobile and other wonderful toys, or Tony Stark, with his computer assistant JARVIS.

I can stop any time. Really.

My predilection for technology has led me to think about technology and its use in the classroom on more than one occasion. Indeed, a search of iTunes will yield four Summer Institutes(^2), generously funded by the Andrew W. Mellon Foundation, that I organized and that focused on Technology and New Media within the academy. Most of my personal focus has been on how the small, incidental uses of technology can improve the life of a faculty member and the experience of students in the classroom rather than on large scale initiatives — how a service like Periscope, for example, can come to your aid when you have to stay home with a sick child, as opposed to an analysis of how to roll out online learning campus-wide for all faculty and students. I know people who do the latter (and try to work closely with them) and respect their work. It’s just that this is not where my active interest currently lies.

As with all things, both levels of interest in ed-tech run the risk of losing sight of first causes — the underlying assumptions and needs that drive our decisions. Recently, it took me a surprising amount of effort to trace back to first causes my discomfort with a story I thought I should be excited by when I read it. Union County Public Schools in North Carolina (I am pleased to say my daughter attends a school within this system.) published a piece well worth reading on how Vinson Covington, the AP European History teacher at Parkwood High School, was getting his students to create a mobile app as a vehicle for learning about history.

Before I go any further, I want to make one thing clear. I think this is a fantastic, inventive idea and that Covington should be applauded for his work and for creating an environment where his students are engaged and are challenged to think about the subject differently. Nothing that follows should be seen as taking away from this, my personal and professional (for what little that is worth) assessment of what he is doing, or from my hope that I will see more teachers doing cross-disciplinary and interdisciplinary work like this at all levels of education. It is absolutely critical for all of our futures.

But as I was writing, I read this article and knew I should be interested in and excited by it. Instead, I found myself disquieted. My first response to this disquiet, which I shared on Twitter, was that I would have felt better if it had been part of a team-taught course, where the coding and the history could both be more fully explored by the students. And while I still think that, I no longer believe it is the source of my disquiet. Team-taught courses are great but, from a staffing point of view, only occasionally practical. The kind of thing that Covington, on his own initiative, is doing here is a solution to a real zero-sum game that administrations play when trying to deploy the limited manpower available as best they can.

Ultimately, I believe the source of my disquiet is the set of underlying assumptions about which disciplines should make space for others and how that space should be created. Those assumptions are building a hierarchy that many insist does not exist — even as they participate in building and reinforcing it.

In Covington’s case, there is no sense — even in my mind — that it is wrong for history faculty to introduce a coding project into their classroom. Indeed, I remain in awe of what Mike Drout and Mark LeBlanc accomplished and continue to accomplish with their Lexomics Project and know that I need to find the time to use their tool to satisfy some of my own idle curiosities.

To illustrate my concern, consider how faculty outside English and the Language Arts react when they decide that their students cannot write well. They turn to the English faculty, ask why they have not taught the students better, and look to them to provide solutions. There is no perceived cultural pressure on non-English faculties to introduce writing into their areas in the way there is pressure to introduce coding into history, as in the case of Covington’s work.

And I hasten to add that the kind of cultural pressure I am pointing out is not just negative pressure. Covington has been singled out for praise here for his innovation. Can you conceive of an article that praises a member of a Biology faculty for having Pre-Med students write sonnets to improve their writing and interpersonal skills? Can you see that article treating such assignments as anything other than silly? Or as a waste of time that would be better spent on the difficult subject matter that students are perceived as needing to cover to succeed in medical school?

And yet, no one would claim that understanding how World War I started is easy or unimportant. After all, the Schlieffen Plan was a highly mechanized, semi-automated system of distributed but telegraphically linked command posts with a series of standing orders that, once begun, could not be stopped without destroying the system; understanding it, and recognizing how analogous it is to our contemporary computer-controlled military systems, might be what prevents World War III. And learning about sonnets and people’s emotional reactions to them might give a doctor a better bedside manner, or enough sympathy to notice that, despite the glitter of their walk, a patient may need help. It might help those employed by insurance companies see less of the paperwork of procedure and more of the people trapped within the system.

Innovation, then, must not be seen as having a one-to-one correspondence with technology, science, and engineering. Innovation is what happens when we take new ideas and apply them in any field. The unfortunate truth about the way we currently recognize innovation in the Academy is that we have tied it too closely to the use of technology — so closely that we can no longer see when innovation is taking place in other areas. This matters not just for humanities faculty, who might fear they are becoming second-class citizens within their own disciplines. It also matters to faculty innovators like Brendan Kern, whose podcast on the life of an imagined exoplanet can teach students about biology through the exploration of an alien new world. Such work is currently more likely to be advertised as “fun” or as a bit of fluff than treated as a serious attempt at pedagogical development and innovation that might make material accessible to students.

Whether we at all levels of the Academy choose to see innovation more broadly than the infusion of STEM and its wonderful toys into other disciplines will determine how likely we are to recognize and promote real innovation across all disciplines. It will require challenging many of our assumptions about how we do things and how much of a king our disciplinary content actually is. It will be difficult for many of us. After all, it is easy to give the appearance of innovation if you see people working on robots or a flying car. It is less easy to do so when you watch someone telling or discussing a story. But both represent skills our students will need to be successful and adaptable in the 21st century. We must, then, learn to refuse to be hidebound.

——

1. For those who are curious, I added the hyphen because of some research conducted by Sparrow Alden, who noticed that, in The Hobbit, J. R. R. Tolkien appeared to hyphenate certain two-word phrases to indicate that they stood in for what would have been a single word in Westron (the common language of Men and Hobbits in Middle-earth) and have come down to us as kennings (https://en.m.wikipedia.org/wiki/Kenning). Although early adopter is not traditionally hyphenated and is not as figurative as oar-steed or whale-road, it is nevertheless true that being called an early-adopter signals more than just being the first kid on the block with a new toy.

2. Or you could follow these links:

 The First JCSU Faculty Summer Institute for Technology & New Media

The Second JCSU Faculty Summer Institute for Technology & New Media

The Third JCSU Faculty Summer Institute for Technology & New Media

The Fourth JCSU Faculty Summer Institute for Technology and New Media and Problem Solving in the Interdisciplinary Humanities


Dr. Matthew M. DeForrest is a Professor of English and the Mott University Professor at Johnson C. Smith University. The observations and opinions he expresses here are his own. You are very welcome to follow him on Twitter and can find his academic profile at Academia.edu.

The Importance of Note-Taking Apps

GoodNotes published a blog post about how Shirantha Beddag uses the app in her teaching.

While the blog post is worth reading for anyone who teaches music, it is also good reading for the rest of us. GoodNotes (and similar apps like Apple’s Notes and Notability) are foundation-level apps for those of us in education. As such, they run the risk of disappearing into the background. 

But these note-taking apps are the kind of thing administrators, faculty, staff, and students alike will use day in and day out. For that reason, it’s important to find an app that fits your needs. And, quite frankly, they are one of the biggest reasons to go with an iPad Pro and Apple Pencil combination. We may not all be ready to take advantage of the artistic potential offered by Procreate, but everyone who has meetings inflicted upon them needs a way to take notes.

For me, the reason I ended up going with GoodNotes is its notebook-based organizational metaphor, rather than a system that hews closer to a computer file system, which is what you find in Notability and Apple’s Notes app. All three are strong, and there are others worth looking into. The strength of the App Store is that it provides options that users can weigh. For some students, the ability to record a lecture may be the killer feature, as opposed to GoodNotes’ ability to search handwritten text.

So how will you know, short of downloading them all and playing with them? Fortunately, someone has done that for you already. Serenity Caldwell has a great run-down of several apps on iMore. Also, ask around and see what your colleagues are using. I’ve noticed that people who have chosen an app, rather than just going with a default, are happy to show you how it accommodates their idiosyncrasies.


Dr. Matthew M. DeForrest is a Professor of English and the Mott University Professor at Johnson C. Smith University. The observations and opinions he expresses here are his own. You are very welcome to follow him on Twitter and can find his academic profile at Academia.edu.

The Apple Product No One is Talking About (and Why Educators Should Care)

In the space between iOS 11 and macOS High Sierra, I wanted to offer a few words on the big product Apple announced that (with some exceptions) has gotten little to no air time.

I'm talking about the Apple Town Squares, formerly known as the Apple Stores.

It's worth listening to what Tim Cook and Angela Ahrendts say about the relationship of space and purpose, beginning four minutes and four seconds into the Apple Keynote of 12 September 2017. They spoke of the design philosophy behind Apple Park and the Apple Town Squares. Cook spoke about how Steve Jobs set out to "inspire talented people to do their best work" and said that the retail spaces were designed to be "about learning, inspiring, and connecting with people" as much as they were about retail. Ahrendts spoke about the redesign of the flagship stores to serve as "gathering places for 500 million people" and called them "Apple's largest product".

If you listen to the way they spoke about Apple's facilities, their language could easily describe some of the central aspects of a University. In case that is less than immediately apparent, here are some of the parallels between Apple's facilities and a university's -- if the clearly intentional parallel was not driven home enough when Ahrendts linked the Creative Pros to the Liberal Arts and the Genius to Technology (STEM, for those attached to the current nomenclature of the university):

The Plaza = The Quad

The Forum = The Classroom

The Board Room = Library Study Spaces

The Genius Grove = Faculty Office Hours

The Avenues = The Student Union, with its opportunities for students

Her talk highlighted Apple's efforts to offer Lifelong Learning opportunities -- for free -- to its community. Those who are in the business of supplying such opportunities to potential students at local universities should both take note, as Apple appears to be doing this in a more compelling and targeted manner than universities are, and be alarmed, as Apple is beginning to eat the universities' lifelong-learning lunch.

There is more, however. Apple Park and Apple's Town Squares are described in a language that, as I suggested, parallels the university. Given that universities are trying to figure out how to bring students to their campuses in an age when, quite frankly, there aren't enough students to go around, and when online learning and similar innovations might keep potential students away, Apple's successful strategies should be examined closely by the teams trying to figure out how to differentiate their campuses in ways that make them compelling destinations.


Dr. Matthew M. DeForrest is a Professor of English and the Mott University Professor at Johnson C. Smith University. The observations and opinions he expresses here are his own. You are very welcome to follow him on Twitter and can find his academic profile at Academia.edu.

An Observation about My Glasses

So, a couple months ago, I had to replace my glasses (The prior pair's arm had broken in a manner that could not be repaired.). Having reached a certain age, I needed progressive lenses (I had them in the last pair, which means I reached that certain age a while back.).

Now, as this (intermittent) blog indicates, I went all in on iOS a long time ago. That said, I have a Mac Mini at home to serve as a base of operations and a Mid-2011 iMac on my desk here at work. One of the things that has troubled me about this pair of glasses is that the progression is off. In my last pair of glasses, I would look at the iMac's screen and all would be clear. Now, it is fuzzy and I have to tip my head back.

Just moments ago, I finally figured out why. The progression is not set for people who work on a desktop. It is set up for people who work on a laptop or a mobile device, like an iPad or an iPhone.

A number of the podcasts I listen to are populated with people I respect, bemoaning the death of the Mac in general and desktops in particular. Apparently, it is not just computer companies that have begun to adjust to a post-PC-of-their-memory/imagination world.

There is more potential angst in this than that supplied by aging tech pundits. This kind of change should be noted by the academy, which is a much more conservative and stodgy group than is usually imagined. Perhaps we should begin to consider changing some of our ways if and when our glasses are letting us know that the world has moved on.


Dr. Matthew M. DeForrest is a Professor of English and the Mott University Professor at Johnson C. Smith University. The observations and opinions he expresses here are his own. You are very welcome to follow him on Twitter and can find his academic profile at Academia.edu.

What is So Different about the Three Current Categories?

There are currently three broad categories of devices that users can choose from, each of which can be subdivided:

Mobile: This category includes both handhelds (mobile phones and devices like the iPod Touch) and tablets. While these each solve a specific use case within the category, they all have mobility and a touch-first interface as their two primary features. They also provide their users with a task-based focus through the use of apps instead of applications.

 Strength: Portability of the device and the focus of the apps.

Needed Leverage:  An always (or almost always) available internet connection for extensive file storage.

Weakness:  Not as powerful, in terms of computing power and application flexibility.

Laptops: This category includes all devices that are portable but require a keyboard. Yes, they may have touchscreen capability, but the primary input is designed to be through a keyboard. They share with their desktop-bound counterparts an approach to tasks that uses applications, rather than apps. This represents a difference not only in focus but also in the number of tasks that a user can pretend they are doing at the same time. While the latest iPads can run two apps at once, laptops and desktops can have multiple windows open at the same time.

 Strength: Balances the power of the Desktop with the Portability of a Mobile Device

Needed Leverage: Consistent access to power to recharge its battery.

Weakness:  Heavier than a Mobile Device.

Desktops: These devices are designed to stay in one place and provide significant computing power. Quite frankly, they provide far more computing power than most users need. More important for most users, these devices provide a significant amount of on-device storage for large libraries of files.

 Strength: Raw computing power and the capacity for a lot of local file storage.

Needed Leverage: An Internet connection for remote access when you are away from your desk, and a second device to reach your files while out of the office.

Weakness: Immobile and, for some, too many windows.

 

Seen in this light, the laptop, which was once the way to get work done on the go, is now a compromise device -- a role usually given to mobile devices (It should be noted that this role is usually assigned by people who use laptops.).

It is also worth noting that there is no longer a primary, baseline device for people to use. All three categories are viable computing choices. The question, then, is what the individual user needs at the time (given the constraints of their purchasing power). This is something that strikes home every time I come to Asia. You do not see people using laptops as a primary computing device. You see them using a larger mobile phone. It is a trend, incidentally, that educators should keep in mind, as students everywhere appear to be shifting toward this model, and it represents a shift as significant as the moves from punch cards to keyboards and from text-based to GUI-based operating systems.

The importance of use case here is a critical one in education, which has three primary user groups: students, faculty, and administration/staff. Of the three, the individuals most likely to be tied to a desktop are the administration and staff, who work at their desk (when they are not in a meeting). Professors are semi-mobile users, as they move from their offices to a classroom. Students, however, are clearly mobile users. They move from place to place throughout the day, as they go from class to class. 

It is worth keeping this in mind when making decisions about the technology that students will be issued or be asked to purchase on their own. Mobility and portability, although students may not always recognize it, are important factors in their interaction with technology.


Dr. Matthew M. DeForrest is a Professor of English and the Mott University Professor at Johnson C. Smith University. The observations and opinions he expresses here are his own. You are very welcome to follow him on Twitter and can find his academic profile at Academia.edu.

My Biases

The most important thing you should know about any reviewer (whether an individual reviewer like Roger Ebert or Andy Ihnatko or a corporate reviewer like the Sweet Home or Consumer Reports) is their preferences and biases. If you know these, you can gauge which parts of their response to the thing under review you should listen to carefully and which parts you should dismiss. I stress the you here because it is absolutely not the generic you. It is you personally. What I should ignore and focus on in a review is not the same as what someone else should ignore and focus on.

The iPad Pro

The first bias you should know about is that I really, really like the iPad mini's form factor. I love how small it is, in terms of both portability and use. I don't find the screen too small to type on (because I have fully adjusted to it), and it packs easily into the slim-line bag that usually serves as my briefcase.

As a result, the iPad Pro feels, even after several weeks of use, too big.

I freely admit that the size provides some great advantages. When editing text, a lot of screen real estate is nice -- especially when the iPad Pro is in portrait mode. Whether I am using a note-taking app like Notability or working with a PDF in an app like iAnnotate, the move to the larger screen is the difference between a steno pad and a legal pad.

If you travel economy, the screen size can be an issue -- as I will discuss in more detail in a later post. Once the iPad Pro is set up with the keyboard, the fold-down table gets crowded. It is possible to use the onscreen keyboard with the iPad lying flat on the table, but the angle is less than ideal.

And yet, as I type this and look at the screen, it still feels too big. I don't like that I am carrying around my old notebook backpack again (although its larger capacity is more than a little useful -- especially when traveling.) I had thought that I would have adjusted to both the size and the bag more by now. Since I haven't, you should know that I appear to remain biased against the size.

The Apple Pencil

The Apple Pencil is a game-changing device. I was not prepared for how much of a difference it makes, even when compared to some very good styluses.

From the moment I figured out how to get a paper onto my iPad for marking, I have been using a stylus and I have found two that are -- at least for me -- worth using.

One is Adonit's Jot Pro. The illusion that the clear Precision Disk creates means that I feel like I can be much more precise in my lines. The first generation does have a problem, in that the nibs do eventually wear out and, once that happens, it quickly becomes unusable. The company does offer replacement nibs for purchase and the second generation of the stylus, it is said, has improved reliability (I haven't used it, so I can't comment on whether this is true or not.). Still, it is well worth using and I still have a Jot Mini in my bag.

The other is Applydea's Maglus stylus. I initially backed this as a :fund:it project (an Irish crowdfunding site) and have not regretted it. Before the Apple Pencil came in, I replaced my original, which had developed a small tear in the nib after many years of use, with a next-generation model. The replaceable nibs are a nice feature, and I am really impressed with how much of an improvement the microfiber nib makes, in terms of the feel. I would like to tell you about the graphite nib, but it was on back order. Whatever happens with this test, I expect that the graphite tip may mean the Maglus stays in my bag even if I have access to an Apple Pencil. As I am not an artist, I am not quite sure what to do with the brush nib -- although it is cool to look at and fun to play with.

Even though I still recommend these two styluses, I am amazed by the feel of the Apple Pencil. When I now electronically sign a PDF, I no longer zoom in on the signature line -- as I do even with the high-quality Adonit Jot and Maglus. I can just sign as I would if I were working with paper. The same is true of taking notes in Notability. I used to use the magnified area at the bottom of the screen to write my notes (Incidentally, cursive is much easier to write than printing on a glass screen. Given the choice, I tend to print when taking notes on paper. On the screen, I almost exclusively write in cursive for the speed and for the fewer clicks it produces as the stylus hits the screen.). For an art app like Paper by 53, the Pencil is stunning.

I mention this because I did not expect to be overwhelmed by the Pencil. It, more than any other single thing, is what has made me think I should conduct this test.

The Apple Keyboard

I am troubled by the keyboard.

I am not troubled by its design or its keys. I know that internationally beloved technology columnist Andy Ihnatko does not like the feel of the keys. Like iMore's Rene Ritchie, I find them very usable.

Indeed, I think they may be too usable.

I cannot shake the feeling that all external keyboards are legacy input devices. Watching those who use their mobile phones to text and search brings this home. And while the individual user's choice is, and should be, a matter of personal preference, it is something that educators need to consider. What should we use, model, and expect students to use as they are positioned for a future that includes not only next year but ten years from now? I cannot shake the feeling that in ten years, when my daughter heads off to university, she will not be using an external keyboard. And while it is true that I want her to have every advantage possible, it is equally true that my current students will eventually be her competition in the workplace. If they are wedded to legacy devices, they will be at a clear disadvantage.


Dr. Matthew M. DeForrest is a Professor of English and the Mott University Professor at Johnson C. Smith University. The observations and opinions he expresses here are his own. You are very welcome to follow him on Twitter and can find his academic profile at Academia.edu.

The Experiment

I have been at Guangdong Baiyun University for four days, giving talks and doing some set-up work for the Guangdong Baiyun University Center on American Culture and Race -- which, with the support of the US Embassy in Beijing, Johnson C. Smith University is establishing here with our partner. My set-up included preparing three iMacs, ten iPad minis, and ten iPod Touches for use by those visiting the Center so that they can listen to the podcasts and view the vidcasts and other material we will be uploading to promote mutual understanding between the US and China.

During the time I have been here, I have been using an iPad Pro. That is why I am writing this blog.

Traveling here with the iPad Pro is part of a larger experiment. In brief, we wish to determine whether it can replace a faculty member's desktop computer for a month (Dec. 15 to Jan. 15 -- a time frame that includes travel, a break, an advising period, and, of course, teaching) and to document the successes and the pain points associated with such an experiment.

Yes: desktop.

Quite frankly, much of what is written about whether or not an iPad is capable of replacing a laptop is reductive. The iPad has been able to replace a laptop for several iterations. Indeed, I ceased to use a laptop computer soon after the first iPad was released. Much like Serenity Caldwell did recently during her iPad Pro experiment, I closed my laptop for a week and tried to see if I could successfully complete what I needed to do with just the iPad. I have written and edited full-length articles on iPads for years. (You have actually been able to write articles on an iPhone using only the Notes app for a long time. The screen size just makes it inconvenient. Inconvenient is not the same as impossible. After all, the typewriters that came out at the same time as the early PCs had screens that could show far less than today's phones can.)

There was, of course, a learning curve. I discovered, however, that the overwhelming majority of my tasks could be completed on an iPad and that half of those I could not complete were due to artificial constraints imposed elsewhere (IT staffs have since come around to supporting mobile-centric computing. Indeed, many have embraced it with a fervor that equals or exceeds my own.).

I also learned that many of the perceived constraints of the first-generation iPad had less to do with the device and more to do with me. Typing on a glass screen was initially alien. After a week, however, it was normal and physical keyboards felt strange. Yes, the virtual and physical keyboards were different, but my initial hesitations had to do with adjusting to what was new rather than to what was better or normal.

Disclaimer: While I am not a slow typist, I do not move at the speeds of many professional writers. I never took a typing class, so I never learned to touch-type. For some, the changeover from a good keyboard to a glass screen involves a noticeable degradation in typing speed, and for those whose livelihood depends on the number of words on the page or screen, it wouldn't be a good idea to threaten that livelihood with retraining. For future generations, however, it is worth remembering the angst that accompanied the shift from Typing 101 to Keyboarding 101 in the '80s and the questions about speed and the appropriateness of changing what is taught to students. Because, after all, there were going to be typewriters in offices for a long time, and not everyone would have access to computers.

Whether an iPad can replace a laptop is a question based on a series of false assumptions and comparisons, as was humorously demonstrated by Fraser Speirs' review asking whether the MacBook Pro can replace your iPad. Most comparisons are not as fair as his, as they tend to compare a single product with three primary expressions (the iPad mini, iPad Air, and iPad Pro) to a class of products (the MacBook, MacBook Air, and MacBook Pro, to use Apple's line of laptops as a point of comparison). There is a huge range of capabilities across these three laptops, and, for some users, the iPad mini, the MacBook, or a Chromebook is equally unusable because they need the features available at the Pro level.

With iOS 9, the iPad Pro can, in most cases, easily replace a laptop -- as Speirs has outlined on his blog. The obvious next question is whether the iPad can replace a desktop. Of course, given that laptops can replace desktops, Speirs' experiment could be said to have answered that question already. But technology usage, like all politics, is local. Can his experiment be replicated here?

And, as with most things, the only way to discover is to do.


Dr. Matthew M. DeForrest is a Professor of English and the Mott University Professor at Johnson C. Smith University. The observations and opinions he expresses here are his own. You are very welcome to follow him on Twitter and can find his academic profile at Academia.edu.