Numa Creates the Calendar

July 21, 2017

Thanks to Alissa Simon, HMU Tutor, for today's post.

Last week we introduced a couple of less-than-mainstream calendars. This week, we want to move back to the contemporary calendar, which is based upon the Roman calendar. Julius Caesar, of course, attended to the discrepancies in the calendar. Astronomers of each age are challenged to find clever fixes for slight discrepancies which, over a period of one thousand years, begin to add up. Caesar understood that growing seasons were being negatively affected by these seemingly minor errors, and he corrected some of them. But his calendar was not the first Roman calendar. Other Roman rulers tampered with their own versions of the calendar, often for less respectable reasons than Caesar's. Some rulers wanted to place their names into the calendar as a sort of legacy. Others decided to celebrate festivals whenever they wanted, thus changing the custom and the calendar simultaneously.

Numa Pompilius (8th-7th century B.C.), the second king of Rome, was one of the first Roman rulers to set a fixed calendar. The following text comes entirely from Plutarch's chapter on Numa in his Lives of the Noble Grecians and Romans. It describes how the calendar came about from Plutarch's point of view. This discussion continues to develop our understanding of how cultures have conceived of time, and of the contemporary cultures that base their calendars on similar features. As societies fanned out and Roman civilization fell, threads of Roman society were carried to many other places. The transformation was not uniform, however, and so this investigation into time is meant simply to learn more about the origins of our modern-day customs.

“He attempted, also, the formation of a calendar, not with absolute exactness, yet not without some scientific knowledge. During the reign of Romulus, they had let their months run on without any certain or equal term; some of them contained twenty days, others thirty-five, others more; they had no sort of knowledge of the inequality in the motions of the sun and moon; they only kept to the one rule that the whole course of the year contained three hundred and sixty days. Numa, calculating the difference between the lunar and the solar year at eleven days, for that the moon completed her anniversary course in three hundred and fifty-four days, and the sun in three hundred and sixty-five, to remedy this incongruity doubled the eleven days, and every other year added an intercalary month, to follow February, consisting of twenty-two days, and called by the Romans the month Mercedinus. This amendment, however, itself, in course of time, came to need other amendments.

“He also altered the order of the months; for March, which was reckoned the first, he put into the third place; and January, which was the eleventh, he made the first; and February, which was the twelfth and last, the second. Many will have it, that it was Numa, also, who added the two months of January and February; for in the beginning they had a year of ten months; as there are barbarians who count only three; the Arcadians, in Greece, had but four; the Acarnanians, six. The Egyptian year at first, they say, was of one month; afterwards, of four; and so, though they live in the newest of all countries, they have the credit of being a more ancient nation than any, and reckon in their genealogies, a prodigious number of years, counting months, that is, as years.

“That the Romans, at first, comprehended the whole year within ten, and not twelve months, plainly appears by the name of the last, December, meaning the tenth month; and that March was the first is likewise evident, for the fifth month after it was called Quintilis, and the sixth Sextilis, and so the rest; whereas, if January and February, in this account, preceded March, Quintilis would have been fifth in name and seventh in reckoning. It was also natural that March, dedicated to Mars, should be Romulus's first, and April, named from Venus, or Aphrodite, his second month; in it they sacrifice to Venus, and the women bathe on the calends, or first day of it, with myrtle garlands on their heads. But others, because of its being p and not ph, will not allow of the derivation of this word from Aphrodite, but say it is called April from aperio, Latin for to open, because that this month is high spring, and opens and discloses the buds and flowers. The next is called May, from Maia, the mother of Mercury, to whom it is sacred; then June follows, so called from Juno; some, however, derive them from the two ages, old and young, majores, being the name for older, and juniores for younger men. To the other months they gave denominations according to their order; so the fifth was called Quintilis, Sextilis the sixth, and the rest, September, October, November and December.

“Afterwards Quintilis received the name of Julius, from Caesar, who defeated Pompey; as also Sextilis that of Augustus, from the second Caesar, who had that title. Domitian, also, in imitation, gave the two other following months his own names, of Germanicus and Domitianus; but, on being slain, they recovered their ancient denominations of September and October. The two last are the only ones that have kept their names throughout without any alteration.

“Of the months which were added or transposed in their order by Numa, February comes from februa; and is as much a Purification month; in it they make offerings to the dead, and celebrate the Lupercalia, which, in most points, resembles a purification. January was so called from Janus, and precedence given to it by Numa before March, which was dedicated to the god Mars; because, as I conceive, he wished to take every opportunity of intimating that the arts and studies of peace are to be preferred before those of war.”
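Plutarch's intercalation arithmetic can be checked with a short sketch (illustrative Python; the numbers come directly from the passage above):

```python
# Plutarch's account of Numa's intercalation (illustrative sketch).
# A lunar year of 354 days falls short of a solar year of 365 days;
# Numa doubled the difference and inserted a 22-day month (Mercedinus)
# after February every other year.
solar, lunar = 365, 354
gap = solar - lunar        # 11 days per year
mercedinus = 2 * gap       # the 22-day intercalary month

# Over a two-year cycle, the totals match:
print(2 * lunar + mercedinus)  # 730
print(2 * solar)               # 730
```

As Plutarch notes, even this neat two-year balance eventually needed further amendment, since neither 354 nor 365 days tracks the true lunar or solar year exactly.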

To post a comment, click on the title of this blog and scroll down.

Rare Calendars

July 14, 2017

Thanks to Alissa Simon, HMU Tutor, for today's post.

Merriam-Webster defines calendar as “a system for fixing the beginning, length, and divisions of the civil year and arranging days and longer divisions of time (such as weeks and months) in a definite order”. The reasons for developing such a system are easy to identify. It makes nearly all business navigable. Practicality aside, however, the idea of a calendar actually stemmed from those who noticed nature's rhythms. Early peoples noticed and came to expect that the sun would rise and set. Though time appears fixed as we move between scheduled appointments, it is easy to note that time moves slowly when we are in pain, and quickly when we are having fun. In nearly every possible scenario, humans note the passage of time.

“Kalend”, the Greek word for “I shout”, predates the Roman “calends”, but both play a part in our understanding of time. The Greeks used to notify the public that taxes were due by shouting (thus the term kalend). Their taxes created an arbitrary, but fixed, timetable. Later, Romans used the word “calends” to describe the first day of a Roman month. These usages may have given us language for the development of the calendar itself, but they lack an understanding of celestial events. Most early calendars relied heavily upon nature as their guide. The Egyptians, for example, paid close attention to the cycles of floods. These periods paved the way to a successful civilization by allowing them to raise crops. In turn, the development of an accurate calendar was vital to the success of their crops. Though it dates back 5,000 years, the calendar created by ancient Egyptian astronomers was extremely accurate. Celestially based calendars lack flexibility, however. And since early priests did not allow for change, over time their calendar became disjointed from its intended purpose. Calendars, then, are a mix of celestial events, civic duties and cultural norms. The two calendars that follow may not be common knowledge, but they represent thought experiments regarding the human conception of time.

The French Revolutionary Calendar (or Republican Calendar) was established in 1793, in the midst of the French Revolution. The new calendar sought to reject any ties to the previous monarchy or the Catholic Church. In other words, it completely abolished the Gregorian calendar. It removed religious holidays and completely changed the way that time was accounted for. This calendar was divided into 36 weeks, and each week included 10 days. Days were listed numerically: Primidi, Duodi, Tridi, Quartidi, etc. The months focused on nature and natural events rather than gods or deities. The calendar year began on September 22, the date on which the republic was established. The only holidays came at the end of each calendar year, during the Sans Culottides, when they honored Virtue, Genius, Labor, Opinion and Reward. This calendar did not last, however. Perhaps people did not appreciate the fact that the work week was extended from six days to nine. Whatever the reason, Napoleon abandoned this calendar on January 1, 1806 and returned to the Gregorian calendar. For more information, visit Calendars Through the Ages.
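The arithmetic of the Republican year can be sketched quickly (illustrative Python; the five-day count for the Sans Culottides, matching the five honorees named above, is an assumption not stated explicitly in the post, and a sixth festival day was added in leap years):

```python
# Structure of the French Republican year (illustrative sketch).
weeks = 36          # ten-day weeks in the year
days_per_week = 10  # Primidi, Duodi, Tridi, Quartidi, ...
festival_days = 5   # Sans Culottides: Virtue, Genius, Labor, Opinion, Reward

ordinary_days = weeks * days_per_week
print(ordinary_days)                  # 360
print(ordinary_days + festival_days)  # 365, matching the solar year
```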

While the French Revolutionary Calendar arose as a response to war and devastation, the World Calendar came about mostly due to globalization. Its goal is to remove the complexity of change, thus fostering a more streamlined global world. Its supporters propose a simplified calendar of 364 days. The year is divided into four quarters of three months each, and each quarter contains 91 days. They propose one calendar in which every date is fixed. For example, Christmas would always fall on a Monday. (One benefit of this calendar is that you would only have to purchase one hard copy.) To account for the remaining day of the solar year, they propose World Day, a worldwide celebration the day after December 30. They also include a Leap Year Day once every four years. According to their website, the calendar would bring peace and stability to the ever-globalizing world. The World Calendar could even be memorized. As a result, supporters claim that it would enable smoother business transactions between highly diverse cultures. The major deficit of this calendar is its inability to account for religious holidays, cultural differences and minorities. It appears to be a very business-like solution to what is often a very culturally laden term. In structuring the holidays, it may offend many religions or cultures that rely on lunar calendars and established holy rules. Flaws and all, however, the theoretical exercise is interesting. For more, visit the website for the World Calendar.
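The fixed-date property follows from simple arithmetic, sketched here in illustrative Python (the 31 + 30 + 30 split of month lengths within each quarter reflects the World Calendar proposal):

```python
# Why every date is fixed in the World Calendar (illustrative sketch).
quarter = 31 + 30 + 30  # month lengths within each quarter
year = 4 * quarter

print(year)      # 364
print(year % 7)  # 0 -- exactly 52 weeks, so weekdays never shift
# World Day (after December 30) and Leap Year Day fall outside the
# weekday cycle, keeping every dated weekday the same from year to year.
```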

Time is a social construct. We use it to convey information in our sentences (past, present and future, for example) and to make plans with colleagues, family and friends. Watches and clocks, systematic constructs, enable us to function more accurately. And yet, time has as much to do with pace or tempo as it does with logic and accuracy. For example, try to gain a concrete understanding of the passage of time from someone's oral history. It is nearly impossible. In memory, time functions fluidly, not logically or chronologically. A chain of events strings together in the memory, but it may not be a factual representation of the events, just one person's view of them. Also, the important points seem to take longer, whereas the unimportant stuff is swept aside. Again, importance differs from person to person. Starting conversations and creating dialogue with the purpose of understanding someone's personal view of time is endlessly interesting...especially outside of your own country.

To post a comment, click on the title of this blog and scroll down.

Mundane Inventions

July 7, 2017

Thanks to Alissa Simon, HMU Tutor, for today's post.

The list that follows contains a few inventions that we often take for granted. These mundane items offer a way of analyzing the cultural values and technologies on which we rely (sometimes without knowing it!). Enjoy!

Genetically Modified Organisms (GMO):

Though highly contentious and ethically problematic, GMOs have made it to the mainstream. This is less an invention that we take for granted, and more one that we know little about. Only the savvy consumer realizes that GMO items are in our clothes, food, and products. Originally, farmers saved seeds as the best method of reproduction. Unfortunately, crops were susceptible to pests and weather, so genetic modification made them hardier. As with all inventions, however, there is potential fallout from tampering with nature: making plants resistant also pushed insects to evolve. There are many pros and cons, but the topic is worth investigating, since many of the products that we depend upon also depend upon genetic modification. A better understanding of the science behind both the pros and cons can be found here.

Toilet Paper:

This might be the most mundane item on today's list, but it is an indispensable one. Yet it is a relatively modern development, and one that inspires very little conversation. As with all inventions that have risen to such mainstream status, it is good to look at the product from the outside to determine how it can be improved. As you can imagine, toilet paper first became available to royalty. The sheet size and makeup differed greatly depending upon culture and technology. However, it was not until the late 1800s that toilet paper became its own product. Closely linked to the paper-making process, it rose to popularity after the Scott Paper Company placed it on rolls in 1890. Find a full history in the Toilet Paper Encyclopedia.

Disposable Products:

The idea of disposable is fascinating to unpack. Merriam-Webster notes that the word's first known use goes back to 1643, but has little information on that usage. The first widespread applications of disposable products arrived in the twentieth century, with the rise of such items as disposable diapers, spoons and cups. As we began to throw away items of luxury, so, too, disposable became an adjective describing things like income. Disposable income, Merriam-Webster claims, is “income available for disposal”. I find that definition confusing at best. Today, our society depends heavily upon disposable products such as gloves, diapers and cups, just to name a few. It is important to look at the things we rely upon to better understand our current culture, as well as to gauge what might be best for the future. For example, while disposable gloves have certainly helped the medical field, are disposable cups a necessity? Meant for travel or emergencies, many disposable objects have become mainstream, daily requirements. This trend directly correlates to our increasingly mobile lives. The conversation leads into a wonderful discussion of what is culturally beneficial, helpful or otherwise necessary.

Find more on disposable cups or disposable diapers or disposable gloves.

Pockets:

Did you know that, according to Merriam-Webster, pocket can mean “a small bag carried by a person”? This seems odd at first because a pocket is clearly not a bag. Yet pockets did originate from bags, which is why Merriam-Webster also lists “a small bag sewn into a garment” as an alternative definition. The pocket, a seemingly mundane object, has inspired a dozen turns of phrase. It has gained a number of metaphors because of its utility and significance, and maybe because of its secret contents. Authors are quick to pick up on such cues, and we find mentions of pockets strewn throughout literature.

Women's pockets and men's pockets developed separately and for slightly different reasons. Both seem to have originated as well-hidden spaces, but they diverged from there. Women nowadays often carry money, phones and whatever else in a purse or handbag. Before that, mimicking men's pockets, women resorted to well-hidden belts. (These greatly challenged the skills of a pickpocket, which was, in retrospect, a pretty sophisticated crime.) The original pockets would have been hidden under layers of clothing and attached to a belt of sorts, rather than sewn into the clothes themselves. Women's pockets were difficult to access, whereas men's pockets were sewn into their interior coat linings or inside their suits.

Men have often used pockets for money, food and cigarettes, or whatever else they might need. Before the pocket, men often used knapsacks to carry food, hunting supplies or other necessities. Since the invention of the pocket, men's clothing has found suitable ways of creating useful pockets. Size is understandably important: too large makes clothing overly bulky, and too small makes the pocket useless. The Victoria and Albert Museum claims, “In contrast to the delicate, embroidered pockets of the 18th century, those of the 19th century are larger and quite plain.” This may be true of men's suits, but women's pockets remained fiercely attached to fashion trends, which often interrupt utility. It appears that women's clothing continues to favor designs based on look rather than purpose (though it can be argued that aesthetics serve a purpose, but that is a discussion for a different day). Is it really that surprising to find that something as mundane as a pocket has a political history too?

Food:

Food is a wormhole of investigation. One thing leads to another, and really, ingredients can be successfully combined in so many ways, since success depends upon your very subjective tastebuds. Imagine, however, trying to create a delicious loaf of bread without a recipe. Not so long ago, recipes depended upon terms that lacked any specificity. Terms like “a dash” or “a handful” or “large” mean nothing to the inexperienced cook. Trial and error rules the day in recipe development. It is painstaking and often extremely aggravating. At times, however, a successful recipe grants a certain level of pride. This very interesting (and ambitious) website attempts to offer a timeline of foods, providing rough dates and cultural attachments. It is a fascinating journey through human civilization.

To post a comment, click on the title of this blog and scroll down.

A Blog Is A Blog

June 30, 2017

Thanks to Alissa Simon, HMU Tutor, for today's post.

What is a blog? While unofficial, it appears that the first blog dates back to 1994. The term weblog, coined in 1997, became plain old blog in 1999. Then, as its popularity rose, Merriam-Webster presented blog as its word of the year in 2004. Back then, the word was defined as an “online journal where the writer presents a record of activities, thoughts, or beliefs.”

Blogs continue to be a space for contemplation, ideas, crafts, words or sharing your favorite pieces of culture. They have greatly expanded due to the converging rise of Do-It-Yourself projects. Merriam-Webster now defines blog as “a website that contains online personal reflections, comments, and often hyperlinks, videos, and photographs provided by the writer.” In a second definition, Merriam-Webster claims that a blog can also be associated with an online publication that “relates to a particular topic and consists of articles and personal commentary by one or more authors.” An important aspect of both definitions is that they rely on the term “personal”. While writers always share something personal, there is movement away from the idea of a professional writer, into more of an amateur field.

There are many reasons for the desire to share something personal. However, personal implies a conversation typically reserved for an audience of family and friends. Precisely who is included in our personal circle? Our thoughts are certainly personal, and yet the rise in blogging suggests that humans have a need to distribute their own thoughts more widely. Does a blog offer effective contemplation, conversation? Does it provide a necessary and useful format for society? Or should blogs be relegated to personal interest?

Blogs reflect, I believe, the way in which our societal structure has changed over the past thirty to fifty years. Neighborhoods no longer define community. Instead, we create community through schools, interest groups, activities, churches and family structures. As society alters the style of our community, so it alters our style of communication. In part, these arose simultaneously. For example, we have access to transportation and communication devices with a fair amount of ease. Our ability to text, call, email, or video chat enables us to travel great distances without leaving our homes. It also allows us the freedom to make plans and change them up to the last moment. Transportation grants the freedom to make plans in any number of locations. We can visit friends all over the world with relative ease. And while it is not impossible to maintain strong connections through words alone, visiting certainly helps.

Having this great power of movement, however, also changes the dynamics of our close relationships. While many studies show a correlation between good health and positive relationships, society continues to rely on social media as one form of relationship. I wonder, therefore, how healthy that relationship is for the human psyche, and whether it fills the need we ask it to fill.

One potentially problematic aspect of blogs is that the writer can claim anything. For the most part, there is no editor or fact-checker. Whether looking up information about cooking, crafting, politics or historical fact, you will likely stumble upon nearly every side of a coin, regardless of fact. It may also be difficult to find the information that you need. Searching for a particular issue may actually lead you astray. In other words, the reader must do their own homework, since searchability and reliability remain unresolved issues for blogs.

Having said that, I believe that blogs provide a space in which we can enhance our levels of contemplation. Writing offers many potential benefits. A society which writes must be thinking about a wide variety of issues, entertainments and interests. I like the idea that we can form a web of communication with others whom we do not know, have never met and are unlikely to meet. It has the potential to bring us together in contemplation and discussion, not necessarily in agreement. It seems important to support a society of writers and thinkers. To my mind, this is the best that a blog community can offer: serious contemplation of any subject, coupled with thoughtful commentary.

However, the most glaring drawback of the blog community is the lack of personal interaction. Without the handshake, hug, facial expression or physical presence, some people feel it is acceptable to write something that would be deemed inappropriate in a social setting. It is as if we enable an internal editor when speaking publicly, but dissociate ourselves from this very same editing device when speaking electronically. This divide seriously puzzles and frightens me.

I hope that as the blogging community grows, our awareness of socially appropriate speech will re-engage, that we will be reminded of the power of speech, of courtesy and grace. I enjoy presenting my thoughts in dialogue and I appreciate the responses that articulate both thoughtful approval and dissent. While I still much prefer human interaction and direct conversation, I can see the potential service that blogs may provide.

To post a comment, click on the title of this blog and scroll down.