Literary Language

July 13, 2018

Thanks to Alissa Simon, HMU Tutor, for today's post.

I am interested in the way(s) in which literary language intersects with language itself. By literary language, I mean language that most often occurs in writing, but not necessarily in everyday speech. A marked difference between the spoken and written word of a culture represents diglossia. In other words, a culture with a high degree of diglossia has evolved its language into two distinct functions: written and spoken. The idea of language, then, expands from a single system of communication to a variety of expressions, each appropriate in one situation but inappropriate in others. Using the wrong style of language may therefore lead to misunderstanding. Cicero presents one example of diglossia: he wrote in an elevated, stylized Latin which was not common in everyday speech. Ferdinand de Saussure formalized the idea of language as separate from speech in his structuralist framework. In a nutshell, he claimed that “langue” (which roughly translates to language) represents the totality of a language as a system (including grammar, etc.), whereas “parole” (which roughly translates to speech) is a concrete formulation, such as an act of speech or writing. Langue opens up potential, whereas parole is an actuality or action. In focusing upon the way a single society uses language, one can develop a better sense of the society itself.

During the Middle Ages, England experienced a number of language changes. Chaucer, for example, had to navigate a tri-literate system of French, Latin and English. Chaucer worked at court and therefore dealt in French. His education and writing career demanded the use of Latin. And, of course, he wrote in a vernacular English in a way that had not been done before. His life at court and his work with customs duties afforded Chaucer a rare view of life, one in which he met a great many people. He reflected the great language changes of his time in his writings. Chaucer incorporated French, Latin and English (both grammatical constructs and words) into his writings. Furthermore, he wrote in dialects at a time when dialects were beginning to disappear. Speech in the north of England was changing in different ways than speech in the south. The Canterbury Tales present a diverse set of speakers, which demonstrates both his powers of observation and his skill at characterization. In Saussure's terms, he combined langue (potential speech acts) and parole (actual speech acts) in order to create believable character traits. To do this, Chaucer combined and played with the rules of the speech styles around him, including Old English, Latin and French.

Old English accumulated terms from Germanic and Scandinavian languages. As French became the language of the nobility, it also filtered into daily life. As universities arose (Oxford and Cambridge were among the first, around 1200), scholars and scribes began to unify spelling. Simultaneously with this formalization of spelling and grammar, English began accumulating foreign terms. Chaucer noted these changes in his tales. This marks the transition from Old English to Middle English. Some scholars, however, disliked the palimpsest-like style of this evolving English. Alexander Gil, a prominent teacher of the late 16th and early 17th centuries, reinforced the idea that language should be pure. He published a text on the purity of English, which, ironically, he wrote in Latin. (It is notable that John Milton was one of Gil’s students.)

Rarely does everyday speech take note of grammatical rules, however. Languages and dialects flow together, altering grammar in unpredictable ways. One of the things I love about Old English is the way that it creates compounds. Often two words were thrown together in a sort of metaphor, which resulted in a single, new term. So, for example, a word like wīdwegas is actually a combination of two existing terms. It compounds wīde, which means “far” or “far and wide,” and weg, which means “path, road or way.” The combination, wīdwegas, translates to “distant regions.” However, as other languages began filtering in, particularly French, English slowly absorbed many foreign terms into its lexicon. So, while Gil did not appreciate language change, Chaucer did. Chaucer recognized the ways in which words are formed and imagined how the speakers in his tales would actually speak. This trick allowed him to develop excellent and believable characters.

Sometimes, however, a term is considered pretentious and speakers refuse to use it. In the 16th century, so many new terms were being produced that they became known as “inkhorn terms.” In other words, they were something that writers used, but that were not necessarily a part of common speech. The label, which Thomas Wilson used in 1553, described words that often combined Latin or Greek roots with a variety of prefixes and suffixes to form a fancy, and often pretentious-sounding, new term. (Inkhorn refers to the writer’s inkwell; inkhorn words therefore pertained more to the written word than to the spoken.) There are any number of imaginative and hilarious combinations which have fallen out of use. (Find links to more inkhorn terms at the bottom of this blog.) It is interesting, though, that this style of writing has also given us some useful terms such as “autograph” and “meditate.”

In short, I still wonder why some terms stay and some terms fade. When do we consider grammar to be proper, or forced, or affected? When is grammar natural or pure? How do we judge speech acts if not by our own rules, and when is it acceptable to break the rules? Does metaphor grant an aura of prestige to any given language (or language act)? Can we mix words from the Urban Dictionary, for example, into scholarly writing and have the desired impact? So while Saussure treated langue as a shared, social system and parole as primarily an individual act, I wonder if there is more of an ebb and a flow between the two than we realize.

For more on Chaucer, visit these past blogs:

http://www.hmu.edu/hmu-blog/2018/3/30/chaucer-translations

http://www.hmu.edu/hmu-blog/2018/5/4/translations-of-chaucer

 

For more on Language, try these blogs:

http://www.hmu.edu/hmu-blog/2018/5/25/caedmons-compounding

http://www.hmu.edu/hmu-blog/2016/7/1/etymology-of-independence

http://www.hmu.edu/hmu-blog/2018/2/2/william-james-and-the-stream

 

For more inkhorn terms, visit:

http://www.worldwidewords.org/articles/inkhorn.htm

http://www.macmillandictionaryblog.com/the-fashion-for-inkhorn-terms

http://campus.albion.edu/english/2012/11/06/the-inkhorn-controversy/

 


Pleasures of Reading, Thinking and Conversing in a Science Fiction Age

May 11, 2018

Thanks to Dr. John Reynolds, HMU alumnus, for today's post.

How malleable the notion of science fiction is! What strange places one ends up in when exploring such a seemingly simple question: "Is Star Wars science fiction?" The question grew out of reflections on and discussions about Alissa Simon's blog post “What is Science Fiction” from April 27, 2018. Originally, I planned on exploring important differences between science fiction and fantasy, and I thought that Star Wars would make an excellent cultural artifact for further conversation, especially with the approach of Star Wars Day (May the Fourth Be with You) and a stand-alone Han Solo movie arriving in theaters near the end of May.

I enjoy the passion of the diverse commentators on science fiction who disagree on the classification, value, and influence of Star Wars. They form a community as diverse as the vision for the Star Wars universe. Some find the films and franchise a threat to the genre of authentic science fiction and a disintegrating influence on culture. Others find it part of a benign or even beneficial paradigm shift in our cultural habits concerning narrative, entertainment, and culture. Some scholars and fans make strict distinctions between hard science fiction and soft science fiction. Some adamantly refuse to acknowledge Star Wars as science fiction, citing numerous scientific and technical deficiencies, while others find a home for it in the category of soft science fiction. Those who commend the soft science fiction of Star Wars tend to align it with the ongoing idea of myth. Such mythic identification links the characters, plots, and themes with ongoing archetypes that continue to fascinate human beings across time and cultures. In an older online posting on The American Prospect, Cara Feinberg captures this sense of interest while exploring the question "Is Star Wars Art?" She explains how the Brooklyn Museum's 2002 presentation of Star Wars: The Magic of Myth "examines the mythological roots of the now legendary film saga that explores themes of heroism and redemption and the triumph of good over evil through the creation of characters that exemplify chivalry, nobility, valor, and evil...." Likewise, I recall Joseph Campbell making similar claims while being interviewed by Bill Moyers about the power of myth and the hero's journey in the late eighties.

A few tangential opinions about science fiction provide additional insights about fans and the genre that go beyond concerns limited to Star Wars. Along with the exploratory and predictive functions of science fiction, Jason Sanford asserts that it actually helps create the future, as he winsomely explains how the techies who brought us the Motorola flip-phone were clearly Star Trek fans. In a style reminiscent of Jeff Foxworthy's "You might be a redneck if..." comedy, one interesting post describes "11 Habits That All Sci-Fi Readers Have In Common," ranging from "[l]ooking for the real science behind the fake science fiction," to "[c]orrecting people on the differences between sci-fi and fantasy," and "[c]oming up with plans for when the aliens arrive." A formal study of reading habits suggests that science fiction may entice its readers to become less skillful interpreters of texts. I suspect that any such bad influence depends much more on a given reader's willingness to read any genre thoughtfully. Although my sample size is relatively small, I have known several high school English students who are as critically adept at analyzing Austen and Shakespeare as they are at evaluating android and space stories. Is such science fiction a foe to those of us who deeply value the Great Books and Great Conversation traditions? I think not. When I think of how much one of my current students enjoys discussing traditional literary texts alongside science fiction stories, I am inspired to assert, "It is a universe truthfully acknowledged that technological, sociological, psychological, and spiritual forces need careful balancing."

An even more extensive demonstration of discussing science fiction thoughtfully comes in Adam Roberts' The History of Science Fiction. Roberts carefully examines the contemporary popularity of science fiction and offers a strange point of origin for it in the Protestant Reformation: he asserts that his "core argument is not just that SF begins out of the Reformation; it is that the fierce cultural climate of that time shaped SF, wrote its DNA in ways that manifest substantively even into the 21st century." This provides a striking contrast to the well-worn arguments about science fiction's origin in the nineteenth or twentieth century. He notes that the research that yielded his book's first edition led him to see science fiction

"as a distinctly Protestant kind of ‘fantastic’ writing that has budded off from the older (broadly) Catholic traditions of magical and fantastic romances and stories, responding to the new sciences, the advances in which were also tangled up in complex ways with Reformation culture."

As I reflect on his thesis, I cannot help but think of the root meaning of Catholic as "whole" or "universal." Roberts first provides a helpful summation of his view of a classic Catholic vision of human beings in relationship to the universe:

"To an orthodox Catholic imagination a plurality of inhabited worlds becomes an intolerable supposition; other stars and planets become a theological rather than a material reality, as they were for Dante - a sort of spiritual window-dressing to God’s essentially human-sized creation."

In contrast, he shows how he conceives of the Protestant Reformation vision:

"[The] cosmos expands before the probing inquiries of empirical science through the 17th and 18th centuries, and the imaginative-speculative exploration of that universe expands with it. This is the science fiction imagination, and it becomes increasingly a function of Western Protestant culture. From this SF develops as an imaginatively expansive, and materialist mode of literature, as opposed to the magical-fantastic, fundamentally religious mode that comes to be known as fantasy."

For me, this provides a powerful way of reading the texts of Francis Bacon and surfacing not only his methodology, but also his imaginative vision for scientific purpose. I'm finding motivation to re-read him alongside Dante to further explore these strange contrasts: a rather strong material-spiritual dialectic is at work in comparing these two authors. To clarify his personal position on these two streams of influence, Roberts also gently assures us that he does "not mean to suggest a priority of value or merit of one mode over the other," and that he equally enjoys reading fantasy and science fiction.

Clearly, there is much more to explore in Roberts' expositional history of science fiction, but it offers interesting connections for thinking about the nature and popularity of Star Wars and a host of other modern popular fantastical films. Roberts notes that "[t]he level on which Star Wars works most effectively is precisely as visual myth." By this, he suggests that Star Wars and its legacy appeal to audiences by giving them a grand sense of imaginative connectedness to our ever-expanding sense of smallness in a very big universe, much in the way he envisions the Catholic imaginative tradition working. In this line of thought, even more than the Reformation's break from visual and sacramental ways of imagining the world, our society's increasing secularization leaves many of us hungry for ways to re-enchant our connections to nature, the world, and the larger universe. Awareness of such hungers helps us appreciate Roberts' assertion that "SF is now the most popular form of art on the planet because it has colonised visual media." Star Wars was essentially the first film to break open and popularize this experience of visual myth. Even the current excitement about Avengers: Infinity War resembles the visual myth experience and can be traced back to the influence of Star Wars.

If I understand Roberts correctly, we benefit from becoming increasingly aware of how we become so enamored with the power of visual myth and large-scale spectacles, because such self-awareness serves as an important part of understanding our collective and individual assumptions about our identities. Otherwise, we lose sight of many important, not-so-visual concerns for pursuing human flourishing. Perhaps this is Socrates with a lightsaber admonishing us to know ourselves? Consequently, many of the resources for sharpening our visions of the present and the future come from understanding the influences of the past more clearly and deeply, and we benefit from conversing about and reflecting on these influences. With a healthy dose of optimism, Roberts finds a glimmer of hope related to this concern as he opines that the two heroes of Star Wars: The Force Awakens "are, respectively, a competent and brave woman, and a man of Nigerian heritage," and that "[e]ven as it cycles through the comforting old tropes and features, this new Star Wars is proving what SF has always known, that this is a mode of art intensely hospitable to diversity." Indeed, from the urban centers to the outer rim of our society, many ideas related to Star Wars have surprisingly powerful ways of sparking diverse and thoughtful conversations about past, present, and future visions of human flourishing.

“Difficult to see. Always in motion is the future.”  – Yoda


What Is Science Fiction

April 27, 2018

Thanks to Alissa Simon, HMU Tutor, for today’s post.

A couple of weeks ago, I had the pleasure of attending a three-day conference hosted by the Great Books Council of San Francisco. The event, which took place at Asilomar, offered four discussions focused on one play, one work of non-fiction, one work of fiction and a handful of poems. The wonderful selections were further enriched through discussion. The selected fiction was Ursula K. Le Guin’s The Left Hand of Darkness. Though this story is set on a distant planet and in a distant future, I connected with many of the issues raised in the text. Personally, I feel that Le Guin brilliantly demonstrated what it is like to meet a culture very different from your own. The novel included political frustrations, tensions between genders, misunderstandings of all kinds, economic interests, and an epic journey. To be honest, the setting could be nearly anywhere and at any time because she addressed so many universal societal issues. However, in moving the narrative outside of “earthly” restrictions, Le Guin allows for dynamic debate, divorced from possibly hurtful particulars. So, in part, science fiction allows for an emotional detachment in a way that actual events or specific names and places would not allow.

More than that, though, science fiction allows the reader to examine the consequences of what we often call “progress.” In last week’s blog, HMU Fellow in Ideas Matt Phillips wrote about one of the benefits of writing in a noir style. He writes: “Noir—as a genre and practice—provides an effective palette for drawing, defining, and collapsing contrasts. And contrast, on its face, is what disparity is—an ill-drawn, and often evil, contrast.” In a similar fashion, science fiction provides a palette for understanding unknowns. As science progresses at light speed, it moves far past the average citizen’s grasp. There is no way to keep track of each scientific study or each new technological platform. Science’s broad reach affects our daily lives in demonstrable ways, but more often than not, understanding a new device arrives as an after-effect. Regulators and lawyers struggle to keep abreast of changing technologies. We even struggle to name new technologies, which we often base on natural phenomena, such as “cloud computing” or “website.” The science fiction writer is tasked with thinking in terms of possibility. What if faster is not better? Or, what if faster is greatly better? What are the possible outcomes of gene therapy? Or, who should have access to such potentially powerful tools?

Science fiction, however, does not predict futures. It simply explores them. In the preface to The Left Hand of Darkness, Le Guin writes:


“This book is not extrapolative. If you like you can read it, and a lot of other science fiction, as a thought-experiment. Let’s say (says Mary Shelley) that a young doctor creates a human being in his laboratory; let’s say (says Philip K. Dick) that the Allies lost the Second World War; let’s say this or that is such and so, and see what happens…. In a story so conceived, the moral complexity proper to the modern novel need not be sacrificed, nor is there any built-in dead end; thought and intuition can move freely within bounds set only by the terms of the experiment, which may be very large indeed.

“The purpose of a thought-experiment, as the term was used by Schrödinger and other physicists, is not to predict the future – indeed Schrödinger’s most famous thought-experiment goes to show that the “future,” on the quantum level, cannot be predicted – but to describe the reality, the present world.”

In a recent report on genetic technology, NIH Director Francis Collins claims: “When something truly significant is discovered its consequences are overestimated in the short term and underestimated in the long term." Simply put, it is difficult for the non-scientist to understand the scope of continual scientific evolution. Therefore, the science fiction writer designs and creates a thought experiment meant to discover potential outcomes. The evolution of science fiction seems natural to me, as humans progressively depend upon and live with technology. Excited by new capabilities, we are also curious about implications. For example, if anyone could alter their own genes, should they?

Another article describes how new types of data are being used to identify poverty and restructure impoverished areas. I found it particularly interesting to see the diverse groups necessary to discuss such implementations. The article mentions “a coalition of homeowners, renters, people with the experience of homelessness, nonprofit developers, community associations, religious institutions, policy experts, and university faculty” who will discuss human rights issues in such a difficult transition. I can see how science programs (such as STEM or STEAM) benefit society. I also firmly believe that ethics courses are as necessary as science courses. I believe that this is another benefit of science fiction: it sits almost in a category between the two, uniting science with fictitious outcomes in which we can ask ethical questions. It is authors like Le Guin who question our understanding of progress. I do not mean to imply that we should not embrace technology or change. In fact, rather the opposite. I embrace science, science fiction and ethics as all necessary parts of my education.

In addition to excellent discussions, the group at Asilomar listened to a keynote speech by Corie Ralston, current director of the Berkeley Center for Structural Biology. Not only is she a scientist, but she also writes science fiction. She gave a wonderful presentation that included a short history of science fiction as well as a hopeful look at the genre's future. Her website describes a deep-rooted love of science, nature, creative thought and reading. She writes: “If writing is a way to emotionally understand our world, then science is a way to practically understand our world. And science is more than that to me: it is a way of exploring and creating and learning how to think. It uses the best of our human nature. It is a continual source of awe.” I could not agree more.

Thanks to the wonderful folks of the Great Books Council of San Francisco for welcoming me into their discussions!


Rethinking Invention

April 13, 2018

Thanks to Alissa Simon, HMU Tutor, for today’s post.

“The difference between the present and the past is that the conscious present is an awareness of the past in a way and to an extent which the past’s awareness of itself cannot show.” - T. S. Eliot

I used to work for a professor who would say: “Without the toaster, we’d have no computers!” Each invention brings about a whole new world of possibilities. The toaster may not resemble the computer, but, seen from a distance, they are stages on a continuum. Of course, none of that is apparent at the beginning of any invention; only hindsight provides that kind of perspective.

The first toaster came about in the early 1900s, and even it did not resemble the toasters of today. It browned one side of the bread at a time, requiring the user to flip the toast halfway through. And wouldn’t you know the invention that immediately followed the toaster? Presliced bread. In other words, the new product created space for another new product. This is not surprising and, in fact, seems to be an unwritten rule of invention. It is anyone’s guess which products will survive (like presliced bread) and which will fade.

Listening to Mark Zuckerberg’s Senate testimony got me thinking about invention in general. Zuckerberg has repeated that he did not know exactly what he was creating when he built Facebook. I think that can be said of all invention. And if the inventor does not fully understand the capabilities and repercussions of their creation, imagine the public. We are left wandering behind in various states of interest, desire, greed, paranoia and ignorance. Listening to the questions, I had two thoughts. First: there is clearly a difficulty in framing the right questions, particularly about something so foreign to our own experience and training. Second: humans really do not understand these new technologies.

It is likely that most teenagers use no fewer than three social media platforms a day. Maybe more. They may not be able to imagine a day when these platforms did not exist. But I think it is worth our time to offer some perspective on technology. For this, I thought it best to offer a very visual demonstration of invention, namely, the airplane. In 1903, the Wright brothers successfully flew the Flyer. It was not their first attempt at a plane, but it finally proved that humans could fly. Furthermore, they “discovered the first principles of human flight.” And of course, flight experimentation did not stop there. Nineteen years after the Flyer, Italian designer Caproni built the Ca 60, a prototype of a flying boat intended for transatlantic travel. Looking at it now, in retrospect, it resembles a science project (because, of course, it was). On its second flight, the Ca 60 crashed into the water and broke apart. Airplanes nowadays are sleeker, constructed from entirely different materials and a whole lot more sophisticated, but their builders learned a lot from these early experiments.

Caproni's Ca 60 experimental flying boat on Lake Maggiore, 1921. Photo via Wikimedia Commons.

That there were nineteen years between the first flight and the first pursuit of transatlantic flight is important, however, because it is also roughly equivalent to the length of time in which we have had social media. (The first blogs were generated in 1999, took off in the early 2000s, and had become heavy traffickers of content by 2010. The intent of my blog today, however, is not to define social media, which will have to wait for another day.) Other sites naturally filtered in to fill niche markets. Sites like Photobucket and Flickr, Tumblr and YouTube generated a new way to use, share and create our own content. (During this time, Zuckerberg founded Facebook in 2004.) As social media sites visibly changed and grew with their markets, they also changed on the back end. Data-mining and information-gathering changed too. I think it is important to remember how revolutionary the internet was (and is!). Whereas with the Flyer and the Ca 60 one could see the differences and the reasons for their construction, social media markets are much more subtle.

It seems to me that social media is less social and more media than we originally imagined. What is hidden may be more important than what is received. The ways that we code documents, tag them, like them, and share them all create invisible data which attaches not only to the content in question but also to the users. Ironically, this data is parsed and stored in a variety of middlemen’s hands, on sites like Facebook and Twitter. In complete contrast to the airplane, the internet has masked invention in such a subtle way that users are unaware of their own participation in invention.


When humans did achieve the first transatlantic flights, they had few navigational systems, and no bathrooms or heaters. Imagine Amelia Earhart or Charles Lindbergh, who were embraced for their spirit of adventure and bold daring. The first airplanes carried one person, or a few, at their own cost and at the risk of their own lives. Today, we use the internet more often than we use transportation, and yet we understand it less. Its implications are having profound effects upon our lives, and yet we still cannot see the wheels or wings. How do we make transparent that which cannot be seen? How do we create a spirit of cooperation, much like that of the Wright brothers or Charles Lindbergh?

I am simply wondering, as concepts become murkier and more nuanced, how we educate a global population that is heavily dependent upon such technologies. The Ca 60’s first flight was short and its second, disastrous. Can we risk the same with our websites and internet services? Yet one idea often inspires the next. We are fortunate to have inventors willing to test their ideas, but what happens when the inventions put identity and truth at risk? I ask this because I believe that future inventions will continue to be hidden from sight, and we should find ways of dealing with such subtlety.
