Friday, December 3, 2010
Ethnically, I am a "white" person, of Anglo-Saxon ancestry, brought up as a Protestant Christian. Furthermore, I am a male, raised in a family that would probably be characterized as "upper-middle class".
Though this may change in the future, throughout my lifetime so far these things have placed me in the "dominant" group, with many advantages over others in our society, especially in terms of acquiring wealth and power, and in general having an easier life. I could spend the rest of my life discussing/debating these advantages, but for now I will simply accept the idea that I HAVE had many advantages.
There is at least one area in which being a WASP male puts me at a disadvantage: If I dare to discuss things like race relations or gender inequities, my opinions are automatically discounted, and viewed as being suspect and less valid than those of any in the less-advantaged groups. I will always be vulnerable to accusations of "You just don't get it!" and "There is no way you could ever understand!" These statements are entirely correct.
I realize that in the grand scheme of things, this is not much to complain about, especially when compared with having to face the disadvantages that come from NOT being a white Anglo-Saxon Protestant male from an upper-middle-class family -- sort of like complaining about a shortage of some item at the Country Kitchen Buffet when there are people in the world who are truly starving. Still, it is a real problem when attempting to discuss ideas. The idea of ANYONE having their opinion automatically discounted seems contrary to the idea of communication.
Furthermore, I would argue that ultimately NO ONE can ever truly comprehend another person's situation -- someone who appears to be well off may in fact have overwhelming problems, or someone who appears to be in a horrible situation may be quite happy, or, perhaps most relevant, someone who appears to be in a situation similar to our own may be in a situation that is quite different. The idea of rejecting someone's opinions based upon the idea that they cannot understand another person's situation may be perfectly logical and valid, but it would lead to rejecting EVERYONE'S opinion except your own.
This is a difficult, complicated subject, and I do not claim to have any answers. No one is capable of true objectivity, and no one is capable of pure empathy, no matter how hard we try. I do not claim to be able to comprehend the experiences of people in different situations, or of anyone different from myself, and I am not sure how well I do at comprehending my own situation. Still, I reject the idea that anyone's opinions have no value, and I will continue to offer my own opinions, however flawed.
Thursday, November 25, 2010
Worldly goods
It is popular, at least in our affluent society, to express sentiments such as “You can’t buy happiness” or “The best things in life are free”. After a fire or other event involving the loss of possessions but not life, we console ourselves and other victims by stating that the lost items were “just THINGS”.
Many have come forward to suggest that perhaps you CAN buy happiness. Statements along the lines of “I’ve been poor and I’ve been rich. Rich is better!” are credited to various celebrities (most notably Sophie Tucker, although current thinking suggests that the phrase is best credited to Beatrice Kaufman, if credited to any single individual). (Most recently, I ran across some research suggesting that there is a specific income level that matters for happiness. I believe the idea was that income up to about $75,000 a year does indeed increase happiness, while additional wealth does not offer much benefit.) The facts are contradictory. While there are certainly some possible advantages to wealth and possessions, there are certainly plenty of unhappy rich people.
Another widespread saying in recent years is “Don’t sweat the small stuff”. This idea is also used in conjunction with the “just THINGS” argument -- possessions are intrinsically “small stuff”, while things like your health and your relationships are IMPORTANT.
Years ago, I started telling myself to “sweat the small stuff”. Though my full logic is complicated and perhaps convoluted, it involves the concept that perhaps it is better to worry about and struggle with the small things, things that perhaps you can actually hope to control, rather than the truly overwhelming things like your health and your relationships.
Along the same lines, I have realized I can find great joy in some of my possessions -- and I do not mean status or luxury items like a sports car, a boat, or fancy jewelry. For example, I have a little plastic bottle-top that snaps on to a pop/soda can, and temporarily turns it into a bottle with a screw top (you can now buy these devices in multi-packs from ads on television, but I got mine when you could purchase them singly). Some/many would consider such a device to be pointless at best and stupid/wasteful at worst, but using it can make me happy during otherwise unhappy times. I am also quite fond of my new mechanical pencil.
This is a complicated topic. I am NOT saying that you CAN buy happiness, or that the best things in life are possessions, or even that you should ignore the “big stuff” like health and relationships. I AM saying that it’s okay to take some joy in the small stuff, and it’s okay to enjoy your “things”. Life can be hard, and if you can find some happiness in making a really great sandwich, or possessing a particularly nice-looking guitar pick, go for it!
Truth is complicated.
Wednesday, November 17, 2010
Juan Williams
I had just finished typing a post about National Public Radio when I began to hear news about the firing of Juan Williams.
I am no expert on Juan Williams. I became aware of him because he was the regular host of National Public Radio’s afternoon call-in program, “Talk of the Nation”. For a long time, I knew him only from the radio, but I enjoyed and respected him. Later, I occasionally saw him on television.
According to published reports, Juan Williams was fired from NPR after making some statements during a discussion about terrorism on “The O’Reilly Factor” on Fox News Channel. Specifically, he is quoted as saying, “But when I get on a plane, I got to tell you, if I see people who are in Muslim garb and I think, you know, they're identifying themselves first and foremost as Muslims, I get worried. I get nervous." This was just one excerpt from a much longer discussion, but it seems to be the crucial part in terms of his firing.
However, since the firing, I have been hearing and reading that Juan Williams has been in trouble with NPR for quite some time. According to The Washington Post: “NPR officials say they have repeatedly told Williams that some of his statements on Fox violate NPR's ground rules for its news analysts. The rules ban NPR analysts from making speculative statements or rendering opinions on TV that would be deemed unacceptable if uttered on an NPR program. The policy has some gray areas, they acknowledged, but it generally prohibits personal attacks or statements that negatively characterize broad groups of people, such as Muslims.”
The same Washington Post article notes that Williams states that his specific contract with NPR exempted him from some of these rules governing other NPR employees.
I personally am slightly -- but only slightly -- conflicted on this complicated issue. In general, I tend to believe an employer can make any rules they want, and the employee has the option of following those rules or finding another job. In this case, though, the rules themselves seem wrong, whether or not Juan Williams was bound by them -- especially since this is National PUBLIC Radio. More importantly, it is an oversimplification to claim that Williams actually made a personal attack or a statement that negatively characterized a broad group of people.
The TRUTH -- which is complicated -- is that rather than attacking or criticizing Muslims, Williams was confessing one of his own failings. He was stating one of his inner feelings: a secret, possibly irrational fear of which he was not necessarily proud. He did NOT say “Muslims are terrorists” or “Muslims are bad” or even “Muslims should not be allowed on planes”. If anything, he was attacking and criticizing HIMSELF.
The thing that puzzles me MOST about this issue is that there seem to be some thoughtful, rational, intelligent people who support the firing. This makes me wonder whether there is some aspect of the issue that I am failing to see or consider.
For now, though, I am deeply troubled by the firing of Juan Williams from NPR. Here is MY confession: Since the firing of Juan Williams, I have been listening much less to National Public Radio, on a fairly conscious level. I suppose it is some sort of protest, but I do not know what I am thinking I am accomplishing.
Truth is complicated.
Thursday, November 11, 2010
National Public Radio
Note: I had already written most of this when I began hearing news about the firing of Juan Williams from NPR. Although everything I have written here remains valid, I have been hesitant to post this since my feelings about NPR are currently dominated by the subject of the firing. I will post a separate entry about Juan Williams.
I often listen to National Public Radio while driving my car. This arose partly from the fact that at my home, the signal for their stations tends to be weak. In addition to music, I hear many interesting, thought-provoking, and profound things on National Public Radio while driving my car.
There are various problems associated with this. For one thing, I often miss the beginning and/or end of a story or interview. I also tend to miss or forget the details. If I were reading, or even watching TV in the comfort of my home, I would be more likely to catch people’s names, or the titles of books they have written, and I could further research the information later, at my leisure. (I realize it is POSSIBLE to later track down information heard on National Public Radio, but it can be difficult, especially if my memory is hazy since I was focused on my driving.)
For example, a few months ago I heard part of a fascinating interview with someone who had written a book about something. I was especially impressed with some comments the author made about ... I believe he used the phrase “group identity”. As I recall, it was the author’s contention that the phenomenon of group identity -- the idea that people identify themselves by their nationality, region, race, religion, or an infinite number of other factors -- is a horribly destructive force, and at least partially responsible for much or most of the conflict in our world, and for many other problems as well. He said some eloquent, profound things, and I would like to read more of what he had to say, and I would like to be able to quote him, and link to him in this blog ... but for now, all I can say is that he was “someone who had written a book about something”.
I also have heard interesting comments from “some guy who used to be a high official in Iraq” and “four or five experts on International Law” -- and sometimes I try to quote them, or make other references to the information I heard, but it is always vague and impossible to verify or research. They will probably show up in this blog at some point.
Since the firing of Juan Williams from NPR, I have been considering the extent to which NPR is biased toward certain views. I have always been aware of this bias, but felt it did not interfere substantially with the information I gained from listening, especially since I tend to pay more attention to their guests than to the actual NPR staff.
Now I wonder: If I fundamentally disagree with the policies of NPR management, then am I somehow supporting these policies just by tuning in on the radio? If I attend a lecture or rally, and I fundamentally disagree with the opinions being put forward, does my mere presence at the event imply or deliver some kind of support?
Truth is complicated.
Friday, November 5, 2010
Monty Python
And now for something completely different.
Being a "Python" is a bit like being a "Trekkie" -- though we do not have the universally-agreed-upon cool nickname.
Fans of "Monty Python's Flying Circus" share a unique world.
I stumbled upon the "Monty Python" television show late one evening in the early 1970s. (Incidentally, for those with knowledge of Monty Python, the FIRST sketch I saw was the "killer joke" sketch.) I had seen "Flying Circus" listed in the TV schedule (which at that time omitted the "Monty Python's"), but never tuned it in, and assumed it had something to do with flying or at least a circus. I had not heard it mentioned by any of my friends, but I believe several of them were stumbling upon it at about the same time. Eventually, word of mouth kicked in, but I honestly believe several of us just happened upon it independently.
I could spend paragraphs or pages describing how the television show spread through and permeated our lives. Suffice it to say that it became a sort of language, shared by a growing fraternity of fans. A single phrase, or even a single word, could elicit chortles from those who were "in the know." Perhaps the most curious part is that the single phrase or word may have appeared only once in the television show, yet lives on to this day in the minds of Monty Python fans, and even beyond the minds of Monty Python fans. It is generally believed that the use of the term "spam" in regard to e-mail originated with a Monty Python sketch involving the "Spam" meat product. Climbers on Everest have greeted each other as "Bruce", possibly not even realizing they are referencing a Monty Python sketch in which everyone was named "Bruce."
Even now, decades since the filming of the television show, phrases from Monty Python sketches routinely pop into my head, and I sometimes say them out loud, and often those around me are simply puzzled, though occasionally there is a flash of recognition, and perhaps an appropriate reply.
My point, though, is not to discuss how experiences leave permanent "marks" on our thinking, and how shared experiences -- even shared across continents -- may result in mutual reference points for future communication and bonding. My POINT is to confess and acknowledge that my own thinking has been forever influenced, to an unknown extent, by "Monty Python's Flying Circus". I have no regrets about this, and in fact recommend it to those who have not immersed themselves in that world.
I suppose that legally I should point out that no one from Monty Python is paying me or otherwise rewarding me for recommending them.
Since I have no particular way to conclude these comments, I will mention that often when the Monty Python television show had no way to conclude a sketch, they would have a knight in armor trot out and hit someone with a rubber chicken. When I have no way to conclude a blog posting, I write "Truth is complicated."
Truth is complicated. (Visualize a knight in armor striking someone with a rubber chicken ... feel free to do that anytime I write "Truth is complicated.")
Being a "Python" is a bit like being a "Trekkie" -- though we do not have the universally-agreed-upon cool nickname.
Fans of "Monty Python's Flying Circus" share a unique world.
I stumbled upon the "Monty Python" television show late one evening in the early 1970s. (Incidentally, for those with knowledge of Monty Python, the FIRST sketch I saw was the "killer joke" sketch.) I had seen "Flying Circus" listed in the TV schedule (which at that time omitted the "Monty Python's"), but never tuned it in, and assumed it had something to do with flying or at least a circus. I had not heard it mentioned by any of my friends, but I believe several of them were stumbling upon it at about the same time. Eventually, word or mouth kicked in, but I honestly believe several of us just happened upon it independently.
I could spend paragraphs or pages describing how the television show spread through and permeated our lives. Suffice it to say that it became a sort of language, shared by a growing fraternity of fans. A single phrase, or even a single word, could elicit chortles from those who were "in the know." Perhaps the most curious part is that the single phrase or word may have appeared only once in the television show, yet lives on to this day in the minds of Monty Python fans, and even beyond the minds of Monty Python fans. It is generally believed that the use of the term "spam" in regard to e-mail originated with a Monty Python sketch involving the "Spam" meat product. Climbers on Everest have greeted each other as "Bruce", possibly not even realizing they are referencing a Monty Python sketch in which everyone was named "Bruce."
Even now, decades since the filming of the television show, phrases from Monty Python sketches routinely pop into my head, and I sometimes say them out loud, and often those around me are simply puzzled, though occasionally there is a flash of recognition, and perhaps an appropriate reply.
My point, though, is not to discuss how experiences leave permanent "marks" on our thinking, and how shared experiences -- even shared across continents -- may result in mutual reference points for future communication and bonding. My POINT is to confess and acknowledge that my own thinking has been forever influenced, to an unknown extent, by "Monty Python's Flying Circus". I have no regrets about this, and in fact recommend it to those who have not immersed themselves in that world.
I suppose that legally I should point out that no one from Monty Python is paying me or otherwise rewarding me for recommending them.
Since I have no particular way to conclude these comments, I will mention that often when the Monty Python television show had no way to conclude a sketch, they would have a knight in armor trot out and hit someone with a rubber chicken. When I have no way to conclude a blog posting, I write "Truth is complicated."
Truth is complicated. (Visualize a knight in armor striking someone with a rubber chicken ... feel free to do that anytime I write "Truth is complicated.")
Friday, October 29, 2010
Halloween
Halloween is a complicated holiday.
In a future post I will write about my affection for MOST holidays, including Halloween.
I suppose most holidays have related controversies. Since I have no objection to the observance of Halloween, I cannot accurately state what makes Halloween controversial, but it seems to be related to the idea that Halloween, at least in its present-day form, deals with ghosts and witches and monsters, and possibly violence or at least mischief.
As I was growing up in the American Midwest, one of our pre-Halloween customs was to sneak around in the evenings preceding Halloween and throw handfuls of shelled field corn at our neighbors' windows. This custom has largely died out in this community, though I do not know why. Even while I was still participating in the custom in the 1960s, there was talk that the corn left on the ground might draw rodents and especially rats, but I never viewed that as a legitimate concern. In slightly earlier days, children made a device that my father called a "tic-tac", though I have just found it referred to on the web as a "tic-tac-toe" (this link seems a little iffy, but there is a thorough reference to tic-tacs at http://books.google.com/books?id=TKBGvbp7TEYC&pg=PA147). The tic-tac made a sound similar to corn hitting the window, but more dramatic, and required the mischief-maker to actually stand beside the window, so more courage was necessary, and there was more danger of being caught by those inside the house -- which is perhaps why they were rarely used in my childhood. With both the corn and the tic-tac, the objective was to startle those inside the house, which would possibly cause them to give chase. No other damage was intended.
Another prominent pre-Halloween activity in my youth was "soaping windows". Like a tic-tac, this required the mischief-maker to step right up to the window, and smear a bar of soap on it, possibly writing something or making a design. I can honestly say that I never soaped a single window. Though there was no permanent damage, the practice seemed a bit malicious rather than fun.
A custom that persists to this day is the smashing of carved Halloween pumpkins, or "jack-o'-lanterns". This was not common in my youth. I find it highly objectionable, partly because the pumpkin-carver is often a younger child, and the pumpkin-smasher is often an older child who gets some thrill out of destroying an object of happiness belonging to a young stranger. Whether the pumpkin-carver was young or old, the act of carving the pumpkin had no particular "purpose" other than to bring pleasure, perhaps both to the carver and to those who would see it later. The act of smashing it is just senseless and mean, giving the smasher no benefit other than the pleasure of depriving other people of happiness and destroying the product of someone's labor, not to mention their property. Pumpkin-smashing is abhorrent.
I know that there are people who object to the observance of Halloween on religious grounds. I see no particular basis for their objections, but perhaps this is just the product of my own lack of understanding of their objections.
One of the things that I enjoy about Halloween is that I feel that I am carrying on a centuries-old tradition -- a product of simpler, possibly more mysterious times -- and in some way showing respect for the past. I do NOT feel I am showing respect or disrespect for any particular religion or belief system. I also feel that I am sharing fun with my neighbors.
I have not even mentioned "Trick-or-treating". This custom has also evolved over time and since my youth, but it persists in various forms, and I am happy it persists, though I am mostly happy for the sake of the children.
As for me, I will carve my jack-o'-lantern, and put a few decorations in the windows, and wish everyone a Happy Halloween. Happy Halloween to YOU!
Wednesday, October 27, 2010
Blogger "User Profile"
The "blogger" service offers a way to generate a "User Profile" using the blogspot template. Perhaps if I was wiser and more experienced, I could figure out a way to bypass or at least modify their template, but for now, their template is what I have available.
When you click on "Edit Profile" it brings up eight sections: Privacy, Identity, Photograph, Audio Clip, General, Location, Work, and Extended Info.
Under "Privacy", there are checkboxes for "Share my profile", "Show my real name", "Show my email address", "Show my blogs", and "Show sites I follow". The entire "Share my profile" question would seem to depend partly on what information my profile contains. For now, I have decided to keep my identity at least somewhat private, for a variety of complicated reasons -- so I won't be showing my real name, however I will show an e-mail address, though e-mail inboxes can quickly fill with spam. My first blogs were pretty much "practice" blogs, so there doesn't seem to be any point in showing them. At the moment, I don't particularly "follow" any sites; if I DID, I'm not sure I would want to share that info.
Under "Identity" we have "Username" (required), "Email Address", "Display Name" (required), "First Name", and "Last Name". Most of this is covered in the "Privacy" section.
Under "Photograph" there is only "Photo URL", but then you are given a choice between a photo "From your computer" or "From the web". This is a tricky one, but I suppose that if I am interested in privacy, I should either leave this blank or post something vague, amusing, or thought-provoking.
For "Audio Clip" there is only "Audio Clip URL". This is even more puzzling, and I will probably leave it blank for the forseeable future.
General includes "Gender" (with boxes for male, female, and "not specified"), "Birthday" (with an option to leave the YEAR blank, and a checkbox for "Show astrological signs"), "Homepage URL", "Wishlist URL" (with an option to "Create a wishlist"), and "IM username" (with a pull-down list of different IM services). Privacy concerns one again come into play, and there is also the question of relevance. For now, I guess I will admit that I am male. I have a couple different web sites, but none that are specifically about ME, and posting those would also raise privacy concerns. I might decide to post my astrological sign, but posting your astrological sign leads not-necessarily-correct assumptions about how I feel about astrological signs.
Then there is "Location" with "City/Town", "Region/State", and a pull-down list for "Country". I live in a small city in the midwestern USA -- beyond that, questions of privacy and relevance kick in.
"Work" includes a pull-down list for "Industry" and a blank to fill in for "Occupation". Personally, I consider this to be among the most complicated issues -- the entire issue of to what extent our "occupation" is relevant to who we are. I suppose I will deal with this in blog postings, but it's too complicated for a pull-down list and a filled-in blank.
Finally there is "Extended Info". This area provides larger blank boxes, and instructions to separate things by commas for the categories of "Interests", "Favorite Movies", "Favorite Music", and "Favorite Books". Actually, between the "Interests" and "Favorite Movies" is the "About Me" section, with a note that you may write as little or as much as you'd like, up to 1200 characters. The section concludes with the mysterious "Random Question" -- at this point, I can only assume Blogger generates a question. The big problem with these boxes is that without a detailed, complicated discussion, its easy for someone glancing at the lists to come up with wrong assumptions. Once in college I identified the works of Shakespeare as among my favorites, and a suspicious instructor asked WHY I like Shakespeare. When I mentioned that I found Shakespeare to be funny, the instructor replied that "Yes, his comedies are funny," and I quickly clarified that I thought the Shakespeare's TRAGEDIES were funny, much to the instructor's consternation.
Still, I will try and include some info about this stuff: Amongst my interests, in no particular order, are history and travel and animals and music, to name a few. I rarely attend movies in theaters any more, and, as a general rule, prefer somewhat older movies to more recent. This applies especially to horror movies -- if I get a chance to watch a Boris Karloff movie, I will, and will definitely choose it over any more recent "slasher" flick. This brings up the fact that since I do not rent or attend movies, I am pretty much limited to what they show on television, and they tend to no longer show many of my favorites, including comedies with WC Fields or the Marx Brothers -- neither of which I have seen in decades. I like old war movies, especially the stereotypical ones from WWII where a squad of soldiers is on some particular mission, like blowing up an ammo dump. I have a special fondness for the early Clint Eastwood "spaghetti westerns", which are among the most recent movies on my "favorites" list. I like James Bond movies but I have not seen the most recent ones. It is perhaps relevant to admit that I have not seen many of the most popular movies of the last forty years -- I have never seen any of the Godfather movies, and though I saw the original Star Wars I have not seen the sequels, and I have never seen "Apocalypse Now" though I like "Heart of Darkness".
I guess that moves us into the "favorite books" area (I will return to "favorite music"). I do not read nearly as much as I would like. In recent years I suppose I have read more books on dog training than any other topic. In general, I prefer non-fiction. I possess an extensive library of books about polar exploration -- "Scott of the Antarctic" is one of my heroes -- and also like books about adventures like mountain climbing and exotic travel. As with movies, I tend to favor older over newer books. In my younger days, book sales where libraries unloaded their older books made me giddy, and I have a LOT of books.
"Favorite Music" is an especially complicated topic for me. Though my music tastes vary from moment to moment, I like to think I have a certain appreciation for MOST music, with the general category of "rap" being a notable exception. One of the problems with discussing music is that you need to possess a certain basic knowledge to even know what you DON'T like. For example, I am not sure that I have much appreciation for "heavy metal", but I am also not sure that I know enough about what constitutes "heavy metal" to say that I don't like it. I am not a big fan of opera, especially Italian Opera. With time, I have developed a certain begrudging acceptance for some of the less melodic 20th century classical music, though I still have my doubts about whether all of it should be called "music".
There is so much music that I LIKE that it is overwhelming to try and list it. I like pop, rock (though my tastes now tend toward "oldies" rock), country (though I have not listened much to the country music produced over the last decade or two), classical, jazz (favoring dixieland and 1920s Chicago style over more recent varieties), and folk (I enjoy things like Irish music and Peruvian folk bands, but don't have much expertise in these areas). I rarely attend Broadway-style shows, and do not consider myself much of a fan, but I attended an amazing "Phantom of the Opera" performance and now have at least a partial appreciation for the genre.
My listening tastes seem to be influenced by the music that I can actually perform myself. I play various instruments, and sing for myself (I rarely sing publicly), and I find that I tend to listen to music that I can in some way perform. In my earlier days I played in a rock band, and I still play in brass bands and orchestras and a variety of smaller groups. One interesting result is that I more often listen to male vocalists, such as James Taylor or Gordon Lightfoot, than female vocalists, such as Joni Mitchell or Anne Murray, since it works better for ME to try and duplicate their songs. Some of my favorite recording artists are slightly obscure folksy types like John Prine and Steve Goodman and David Bromberg.
This leaves the "About Me" section, with its 1200 character limit. Though I am sure you can come up with a lot of thought-provoking discussions by trying to briefly describe yourself, I am not sure it results in anything approaching truth or accuracy. Since one of my most basic beliefs is that words are inherently unclear, and leave room for misinterpretation, I tend to use a LOT of words in an attempt to be clear. I find it very difficult to be comfortable with limiting my "About Me" section to 1200 characters, but I suppose that is the ONE area of the template I will attempt to fill in.
Truth is complicated.
When you click on "Edit Profile" it brings up eight sections: Privacy, Identity, Photograph, Audio Clip, General, Location, Work, and Extended Info.
Under "Privacy", there are checkboxes for "Share my profile", "Show my real name", "Show my email address", "Show my blogs", and "Show sites I follow". The entire "Share my profile" question would seem to depend partly on what information my profile contains. For now, I have decided to keep my identity at least somewhat private, for a variety of complicated reasons -- so I won't be showing my real name, however I will show an e-mail address, though e-mail inboxes can quickly fill with spam. My first blogs were pretty much "practice" blogs, so there doesn't seem to be any point in showing them. At the moment, I don't particularly "follow" any sites; if I DID, I'm not sure I would want to share that info.
Under "Identity" we have "Username" (required), "Email Address", "Display Name" (required), "First Name", and "Last Name". Most of this is covered in the "Privacy" section.
Under "Photograph" there is only "Photo URL", but then you are given a choice between a photo "From your computer" or "From the web". This is a tricky one, but I suppose that if I am interested in privacy, I should either leave this blank or post something vague, amusing, or thought-provoking.
For "Audio Clip" there is only "Audio Clip URL". This is even more puzzling, and I will probably leave it blank for the forseeable future.
General includes "Gender" (with boxes for male, female, and "not specified"), "Birthday" (with an option to leave the YEAR blank, and a checkbox for "Show astrological signs"), "Homepage URL", "Wishlist URL" (with an option to "Create a wishlist"), and "IM username" (with a pull-down list of different IM services). Privacy concerns one again come into play, and there is also the question of relevance. For now, I guess I will admit that I am male. I have a couple different web sites, but none that are specifically about ME, and posting those would also raise privacy concerns. I might decide to post my astrological sign, but posting your astrological sign leads not-necessarily-correct assumptions about how I feel about astrological signs.
Then there is "Location" with "City/Town", "Region/State", and a pull-down list for "Country". I live in a small city in the midwestern USA -- beyond that, questions of privacy and relevance kick in.
"Work" includes a pull-down list for "Industry" and a blank to fill in for "Occupation". Personally, I consider this to be among the most complicated issues -- the entire issue of to what extent our "occupation" is relevant to who we are. I suppose I will deal with this in blog postings, but it's too complicated for a pull-down list and a filled-in blank.
Finally there is "Extended Info". This area provides larger blank boxes, and instructions to separate things by commas for the categories of "Interests", "Favorite Movies", "Favorite Music", and "Favorite Books". Actually, between the "Interests" and "Favorite Movies" is the "About Me" section, with a note that you may write as little or as much as you'd like, up to 1200 characters. The section concludes with the mysterious "Random Question" -- at this point, I can only assume Blogger generates a question. The big problem with these boxes is that without a detailed, complicated discussion, its easy for someone glancing at the lists to come up with wrong assumptions. Once in college I identified the works of Shakespeare as among my favorites, and a suspicious instructor asked WHY I like Shakespeare. When I mentioned that I found Shakespeare to be funny, the instructor replied that "Yes, his comedies are funny," and I quickly clarified that I thought the Shakespeare's TRAGEDIES were funny, much to the instructor's consternation.
Still, I will try and include some info about this stuff: Amongst my interests, in no particular order, are history and travel and animals and music, to name a few. I rarely attend movies in theaters any more, and, as a general rule, prefer somewhat older movies to more recent. This applies especially to horror movies -- if I get a chance to watch a Boris Karloff movie, I will, and will definitely choose it over any more recent "slasher" flick. This brings up the fact that since I do not rent or attend movies, I am pretty much limited to what they show on television, and they tend to no longer show many of my favorites, including comedies like WC Fields or the Marx Brothers -- neither of which I have seen in decades. I like old war movies, especially the stereotypical ones from WWII where a squad of soldiers is on some particular mission, like blowing up an ammo dump. I have a special fondness for the early Clint Eastwood "spaghetti westerns", which are among the most recent movies on my "favorites" list. I like James Bond movies but I have not seen the most recent ones. It is perhaps relevant to admit that I have not seen many of the most popular movies of the last forty years -- I have never seen any of the Godfather movies, and though I saw the original Star Wars I have not seen the sequals, and I have never seen "Apocalypse Now" though I like "Heart of Darkness".
I guess that moves us into the "favorite books" area (I will return to "favorite music"). I do not read nearly as much as I would like. In recent years I suppose I have read more books on dog training than any other topic. In general, I prefer non-fiction. I possess an extensive library of books about polar exploration -- "Scott of the Antarctic" is one of my heroes -- and also like books about adventures like mountain climbing and exotic travel. As with movies, I tend to favor older over newer books. In my younger days, book sales where libraries unloaded their older books made me giddy, and I have a LOT of books.
"Favorite Music" is an especially complicated topic for me. Though my music tastes vary from moment to moment, I like to think I have a certain appreciation for MOST music, with the general category of "rap" being a notable exception. One of the problems with discussing music is that you need to possess a certain basic knowledge to even know what you DON'T like. For example, I am not sure that I have much appreciation for "heavy metal", but I am also not sure that I know enough about what constitutes "heavy metal" to say that I don't like it. I am not a big fan of opera, especially Italian Opera. With time, I have developed a certain begrudging acceptance for some of the less melodic 20th century classical music, though I still have my doubts about whether all of it should be called "music".
There is so much music that I LIKE that it is overwhelming to try and list it. I like pop, rock (though my tastes now tend toward "oldies" rock), country (though I have not listened much to the country music produced over the last decade or two), classical, jazz (favoring dixieland and 1920s Chicago style over more recent varieties), and folk (I enjoy things like Irish music and Peruvian folk bands, but don't have much expertise in these areas). I rarely attend Broadway-style shows, and do not consider myself much of a fan, but I attended an amazing "Phantom of the Opera" performance and now have at least a partial appreciation for the genre.
My listening tastes seem to be influenced by the music that I can actually perform myself. I play various instruments, and sing for myself (I rarely sing publicly), and I find that I tend to listen to music that I can in some way perform. In my earlier days I played in a rock band, and I still play in brass bands and orchestras and a variety of smaller groups. One interesting result is that I more often listen to male vocalists, such as James Taylor or Gordon Lightfoot, than female vocalists, such as Joni Mitchell or Anne Murray, since it works better for ME to try and duplicate their songs. Some of my favorite recording artists are slightly obscure folksy types like John Prine and Steve Goodman and David Bromberg.
This leaves the "About Me" section, with its 1200 character limit. Though I am sure you can come up with a lot of thought-provoking discussions by trying to briefly describe yourself, I am not sure it results in anything approaching truth or accuracy. Since one of my most basic beliefs is that words are inherently unclear, and leave room for misinterpretation, I tend to use a LOT of words in an attempt to be clear. I find it very difficult to be comfortable with limiting my "About Me" section to 1200 characters, but I suppose that is the ONE area of the template I will attempt to fill in.
Truth is complicated.
Thursday, October 14, 2010
"You lie!"
Thirteen months ago, as the President of the United States was making a speech to a joint session of Congress, one of the Congressmen shouted out "You lie!" I am deliberately not mentioning the name of the President, or of the Congressman, or the subject or context. I believe all those things distract from what is truly interesting about this incident.
The moment was replayed over and over again on countless newscasts. Since it was a Presidential speech to a joint session of Congress, the camera was focused on the President, with the Speaker of the House (who happened to be from the same party as the President) seated behind the President, visible in the same camera shot. Perhaps the most commented-on aspect of the video replay was the look on the face of the Speaker of the House -- more dramatic than the look on the face of the President.
The incident generated a sort of frenzy in the media, with much discussion. It was reported that there was a significant increase in donations to both the involved Congressman and his opponent in the next election. Eventually there were official congressional proceedings against the Congressman, and he received some sort of official reprimand.
The entire issue was very emotional, though there is room for debate about how much was genuine emotion and how much was contrived. The Congressman who shouted out "You lie!" MAY have been overcome with emotion at that moment, or he may have been planning his outburst for weeks. All that followed MAY have been motivated by genuine outrage, or it may have been the product of cold political calculation.
For me, the biggest revelation of the entire incident was that Congress has specific rules against saying "You lie!" to the President, as well as against making various similar statements. This raises a number of obvious questions regarding the subject of what, if anything, a Congressman is allowed to say or do if a President lies. Perhaps the rules are meant to acknowledge that the President of the United States is somehow incapable of lying.
For me, the most interesting aspect of all that followed the incident was that there was little public discussion of the question of whether or not the President had been lying just prior to the moment when the Congressman shouted the words. Certainly it can be argued that since Congress has rules against a Congressman shouting "You lie!" at the President, and the Congressman shouted "You lie!" at the President, he broke the rules, and any discussion of whether or not the President was lying is irrelevant. Still, I would have thought that if there is any sort of objectivity or intellectual curiosity in our society, the question would have been debated in the media.
I must acknowledge that I DID hear the question raised ONCE on a news program. A person defending the President stated that at the moment in question, the President was speaking purely hypothetically, about something that existed only in his own mind, and therefore the Congressman could not absolutely state that he was lying, since the President was not speaking about anything real. This defense is interesting, but problematic on various levels given the specifics of what the President was saying, and also the follow-up actions on the part of his supporters.
I would also have enjoyed hearing some debate about whether Congress should continue to have rules against saying "You lie!" to the President. The question MUST have come up, but I never heard it raised. Most of the debate that I heard was with regard to what actions should be taken following the statement -- how much apologizing should be done, and to whom, and how much punishment should be meted out. Perhaps these questions were necessary, but I could not help viewing them as simply deliberate distractions from the questions that should have been addressed.
Wednesday, October 6, 2010
Television scheduling
Television is a complicated topic. While it is easy to attack “television” and its effect on society, there are many positive aspects to television -- but that’s not what I intend to write about.
Good or bad, television is currently a part of our society. It would be wise to acknowledge the importance of television in our society, and also to examine aspects such as scheduling, the importance of which I believe is largely overlooked. (I acknowledge that “importance” is a complicated topic, and to a certain extent in the eye of the beholder.)
For one thing, I suspect the differences in television scheduling across time zones have some marked effects on people’s day-to-day lives. Growing up in the Midwest, I knew “prime-time” television as running from seven to ten in the evening, with the local news coming on at ten. On the east coast, prime time runs from eight to eleven, with the local news at eleven. It is difficult for me to believe this does not have a significant effect, causing people in the different regions to live according to different schedules. Even people who never watch a moment of television are affected by the fact that OTHER people watch television, and watch it according to the networks’ schedule.
This is the time of year when programming executives most often feel compelled to change their schedules. The most fanfare is given to brand-new programs, but it is also a time when older programs are canceled or shifted to different time slots, often on different days. All of this has an effect on daily life, especially when dealing with exceptionally popular programs. Again, even people who never watch a moment of television are affected when millions of people who have been watching a program on one night of the week abruptly shift to watching it on another night of the week.
While many scheduling changes are made at the national network level, local programming executives also make life-altering changes, though these changes are even harder to document and study. I believe that when a local channel that has been broadcasting something like “Seinfeld” or “MASH” every weeknight at a certain time suddenly switches to broadcasting “Friends” or “Everybody Loves Raymond”, this also has appreciable effects on the day-to-day lives of people in the broadcasting area -- though without studying this phenomenon, I cannot say what these effects might be.
I am not ashamed to admit that I enjoy television. Among other things, I enjoy using television as a sort of clock or calendar, to mark distinct points in the day or week. My elderly mother is losing many of her cognitive abilities, but she still knows that at four o’clock on weekdays she will watch “Jeopardy”, but NOT on weekends.
Although I watch less television now than sometimes in the past, I continue to watch television late at night as I end my day and prepare for sleep. Most of the channels I favor late at night show the same programs every weeknight, so I can strive to finish my daily activities in time to watch “Star Trek” or “Jay Leno” or whatever. I am disappointed, sometimes severely, if I sit down to watch one of these “usual” late-night shows and find that for some reason it is not being broadcast (in particular, one channel often changes their regular late-night schedule to show sporting activities). And, just as my mother is disappointed by the lack of “Jeopardy” on weekends, I am disappointed by the absence of my “usual” late-night shows on weekends.
To be honest, I suppose I would prefer that the channels that show the same programming FIVE days a week would show it SEVEN days a week, but I realize that in general, they are not going to DO that. However, I remain puzzled by the fact that these channels consider late night Friday, and the early AM hours of Saturday, to be “weeknight”, while late night Sunday (and the early AM hours of Monday) are considered “weekend”. For most Americans, Friday night is part of “the weekend”. To be fair, Sunday night may also be considered part of “the weekend”, but the later part of Sunday is spent preparing for a weekday, as is the later part of Monday, Tuesday, Wednesday, and Thursday, while the later part of Friday is spent preparing for ... more of the weekend. As far as I am concerned, it makes much more sense, and would be much better, to have the “weeknight” programming late Sunday night and early Monday morning, and show the “weekend” programs late Friday and into Saturday morning. Perhaps the most controversial thing I have to say regarding this topic is that I truly believe that if the television programming executives were to make this change, we would live in a better, happier, more productive world.
I admit that various technical developments, such as the increase in the number of television “channels” from three to hundreds, recording capabilities such as VCRs and DVRs, and the ability to watch things “on-demand”, make these scheduling issues decrease in importance -- though they raise an entirely NEW set of issues for society, to be addressed in a future posting.
Truth is complicated.
Thursday, September 23, 2010
Trisomy 21
In an earlier post, I mentioned the problem of attempting to communicate using terminology that, while ostensibly clear, is so potentially offensive or inflammatory that it interferes with communication. This problem is difficult to even discuss, because of the need to reference terminology that is potentially offensive or inflammatory. Still, I view it as an important problem. I apologize in advance to those I may be about to offend, and I suggest you might want to just skip this posting.
This problem is perhaps most often encountered when dealing with matters of race, ethnicity, or anything else concerning broad groups of people. For me, the very best example comes from dealing with the subject of the human genetic abnormality most commonly caused by being born with three rather than two copies of the twenty-first chromosome, technically referred to as “Trisomy 21”. (I have NO expertise on this condition. I am simply using it as an example of a subject that can raise difficulties with communication.)
When I was a child, I learned that people with a certain genetic abnormality were referred to as “mongoloid”. I was taught that people with this genetic abnormality faced varying degrees of mental and physical challenges, but were normal people who lived normal lives.*
Years later, I was told that the term “mongoloid” was offensive and demeaning, and should never be used, and should be basically replaced with the term “Down’s Syndrome”. When I asked WHAT was so offensive and demeaning about the term “mongoloid”, I was told that, among other things, the full term was “mongolian idiot” or “mongolian idiocy”. (Curiously, though I have heard the term “mongoloid” dozens or even hundreds of times used by a wide variety of people, the ONLY people I have ever heard use the term “mongolian idiot” are those advocating the use of the term “Down’s Syndrome” ... that is, the ONLY people who I have heard USE the term are the ones who say it is deeply offensive.)
For several years, this created a difficult situation. If I used the term “Down’s Syndrome”, most of the people I encountered had no idea what I was talking about, and I would have to say something to the effect of, “You know, mongoloid.” I struggled with this issue, as my choices seemed to be limited to either being misunderstood, or using an offensive, demeaning term. One of my coping strategies during this period was to use the scientific term for the genetic disorder, “Trisomy 21”, but that term was understood by even fewer people than the term “Down’s Syndrome”.
Over the years, this particular problem seemed to resolve itself as the term “Down’s Syndrome” became widely used and understood, though as I was writing this I did some online research and apparently there has been a movement to replace “Down’s Syndrome” with “Down Syndrome”.
I still struggle with the underlying philosophical problem of how to communicate clearly when some of the best-understood terms are considered offensive or inflammatory. I continue to believe that clear communication should be the priority -- but offensive terms, even if easily understood, can sometimes stand in the way of clear communication. There is more to clear communication than being correct, and sometimes there is even more than being clear.
Truth is complicated.
* As I was typing, I automatically placed the word "normal" in quotation marks, to acknowledge that the entire concept of "normal" is always a bit dubious and open to interpretation. Later, I realized my quotation marks might be viewed as expressing reservations about the "normalcy" of those with Down Syndrome, so I have removed them.
Friday, September 17, 2010
Correctness versus Communication
The discussion of the debate over whether to call certain guitar chords “barre” or “bar” has pushed me to contemplate the idea that sometimes “correctness” conflicts with “communication”.
These days, any mention of “correctness” with regard to “communication” brings up the idea of “politically correct” -- which is actually not my focus here, but I guess I need to briefly address it.
Though phrases like “politically incorrect” or “politically correct” or even the abbreviation “P.C.” get tossed around a lot, the precise definition of these terms is not widely agreed upon. I am generally a fan of Merriam-Webster, and their online dictionary defines “politically correct” as “conforming to a belief that language and practices which could offend political sensibilities (as in matters of sex or race) should be eliminated”. A similar definition is posted in the “Free Online Dictionary” -- “Of, relating to, or supporting broad social, political, and educational change, especially to redress historical injustices in matters such as race, class, gender, and sexual orientation.”
Various sources cite the fact that the phrase “politically correct” is generally used in a disparaging manner -- it is commonly thought to be a BAD thing to be “politically correct”. Without going into a thorough discussion, I will agree that I consider “political correctness” to be, in general, a negative thing, if for no other reason than that it inhibits free-flowing communication -- but I acknowledge that political correctness is a complicated topic.
My focus here, though, is actually other forms of correctness, and the extent to which they can interfere with communication. Here are two real-life examples:
My late father was a physician, and regularly attended medical conferences where guest experts would lecture on medical topics. There was a certain lecturer whom my father particularly enjoyed, and whom he encountered various times over the years. Sadly, I do not know this lecturer’s name. My father mentioned that eventually this lecturer became totally blind, but still delivered flawless commentaries to his slide shows, from memory. At one lecture, when referring to the large intestine, or “colon”, the lecturer repeatedly pronounced “colon” in an unusual manner, placing equal accent on both syllables, and saying “ahn” rather than “in” or “en” for the second syllable. At the end of the lecture, a member of the audience criticized the lecturer for “mispronouncing” the word “colon”. As my father related the story to me, the lecturer responded that he prepared his lectures very carefully, rehearsing and fine-tuning them before a variety of listeners. He had discovered that when he pronounced “colon” CORRECTLY, his listeners often had trouble understanding what word he was saying. When he MISPRONOUNCED it, they understood he was saying the word “colon”, though they believed he was mispronouncing it. So, in the interest of clearer communication, he now consciously mispronounced the word “colon”.
When I tell people the story of this deliberate mispronunciation, they often seem to find it troubling. I find it difficult to argue with the goal of clear communication.
Example number two: I knew an intelligent, well-educated, highly skilled musician, who had strong feelings about the terms used to name her primary instrument, which many people would call the “French horn”. My friend insisted that the correct term was simply “horn”, citing, among other things, the idea that there has never been anything particularly “French” about the “French horn”. I should note that I never completely understood her objections to the term “French horn”; perhaps I am misrepresenting her views. Personally, I lack the expertise to know whether it is truly incorrect to refer to the instrument as a “French horn”. I know that among orchestral musicians, the word “horn” is sometimes adequate to refer to that particular instrument. Unfortunately, to many people, in many contexts, the word “horn” refers to a wide variety of instruments, rather than just the “French horn”, including trumpets, trombones, and tubas, and sometimes even things like saxophones and clarinets. So, when people would ask my friend what instrument she played, and she would reply “horn”, they would often follow up by asking “WHICH horn?” My friend, though possibly being “correct”, was standing in the way of clear communication, refusing to use the terms which would be understood by the larger audience.
These two examples show two sides of the problem, or at least two totally different ways of dealing with it. The medical lecturer believed his top priority was to be understood, and was willing to be viewed as guilty of mispronouncing a word. The musician was more concerned about using the “correct” terminology for her instrument, and was willing to sacrifice clarity of communication and possibly necessitate follow-up questions. As with so many subjects, there are differences regarding personal priorities.
This is a complicated issue, and I do not claim to have the answers.
There is a separate but related issue regarding communication that is so potentially offensive or inflammatory that the clarity is impeded, as in matters of race and ethnicity, among other things. I had intended to go ahead and discuss that issue, but have decided to wait, so as to not risk impeding the clarity of what I have already written.
Truth is complicated.
Thursday, September 9, 2010
Blog First Anniversary
This blog turns one year old today. This is of debatable importance, but it does offer a good opportunity for reflection.
One of my primary reasons for doing this blog is TO LEARN TO DO A BLOG. In the words of Michelangelo, “Still I am learning.” In some ways, I have just begun to scratch the surface of blogging. For instance, so far I have posted only text -- no audio or visual data. Only recently have I begun to post links to other URLs, and in particular to a blog at which I posted a comment.
This brings up the idea that I believe only a few people have actually READ anything on this blog. I do not advertise it, and I have not informed even my friends and relatives of its title or URL, so visitors are quite rare, though there are some attempts to leave spam/ad comments. The entire “comments” aspect is one of the things I am learning about blogging. For now, I enjoy the occasional comment, both for learning about the blog process, and for stimulating ideas and discussion, but I am fine with having an extremely limited audience.
Overall, I am satisfied with the first year of this blog, though in many ways it is sort of running behind schedule. It has been a busier year than I anticipated -- non-blog-wise -- and there has not been much time for writing or for learning the intricacies of blogging. Still, I AM learning.
There are three blogging problems for which I have not found any solutions:
1) Realistically, I probably do not have enough “free” time to do this blog. Free time is a complicated topic, and too complicated to discuss here (because, among other things, there is not enough TIME ...).
2) To a certain extent, everything I write in this blog builds upon what I have already written. Yet I cannot assume that someone reading one of my blog entries has read everything leading up to it. In my mind, none of my blog entries truly stand alone. As a minimum, all are dependent upon certain basic concepts, like the idea that written communication is fundamentally unclear, and the fact that I believe the rule about always placing periods inside of quotation marks is WRONG. More specifically, I sometimes realize as I am writing that I need to explore some other idea FIRST -- so I stop writing and instead write about that first idea before writing about the second idea ... but there is no guarantee that someone reading the second idea will have read the first idea.
3) I have strong opinions about politics, and would like to share them, but among my strong opinions is the idea that we have become so polarized politically that many people automatically reject any idea that they believe comes from “the opposing side”. I truly believe that if George W. Bush and Barack Obama were to give precisely the same speech, Republicans would praise Bush’s speech and condemn Obama’s, while Democrats would praise Obama’s and condemn Bush’s. Furthermore, I believe this goes way beyond politics. Once I start stating my opinions on politics, those who perceive that they are on “the other side” may not even seriously consider my opinions on music or dog-training. One of the main points of this blog is COMMUNICATION. I cannot communicate if people refuse to consider my ideas.
As I said, I have no ready solutions for any of these problems. Regarding time, I will probably just struggle along with slightly lowered expectations. I hope, on average, to post one new blog entry each week. Regarding the fact that each blog entry builds upon the previous ones, I will perhaps try to link back to some of my earlier entries. In some ways, I keep stating the same ideas over and over again, so perhaps it doesn’t matter so much whether anyone has read any earlier entries. Regarding the problem of alienating people who then decide to automatically reject my opinions, there is probably nothing I can do, so I hope to go ahead and post more on politics and other controversial topics in the coming year, especially considering that there is an important election coming up.
I also hope to post some sort of “about” or “profile” page. I have mixed emotions about this. I like the blog entries to speak for themselves, but I feel a sort of vague obligation to post SOMETHING about myself.
This brings up the fact that after a full year, what I refer to as “this blog” continues to exist as basically two twin blogs, with mostly identical postings. The two blog hosts have different formats, though, so some things will be different, including the “about” or “profile” page. In particular, one format encourages creating specific “pages” -- I may post a “quotations” page or a “basic truths” page on that blog, but I am not sure how I will go about posting that information in the blog that does not have specific “pages”.
I continue to wonder whether I will run out of things to write about. So far, I generally have much more in mind to write about than I have time to write, and I struggle with figuring out which topics to choose, and whether the order of topics makes a difference in clarity and understanding. I feel a bit ... well, I suppose the best word is “guilty” ... about the fact that I tend to write things well in advance of actually posting them online. It lacks a certain spontaneity, but it gives me a chance to reconsider my words, and especially to try and arrange my ideas in a certain logical order.
Perhaps I should focus more upon the fact that there is a very real question about whether most of my blog entries will ever be read by anyone else, so most of my concerns are totally academic.
Truth is complicated.
Thursday, August 12, 2010
Beyond songs and guitars
At first glance, my previous blog entry, “Concerning songs and guitars”, would only seem relevant to someone with at least a rudimentary knowledge of how to play a guitar. A slightly deeper examination of that entry -- which was itself a reply to another person’s blog entry -- and of the associated “comments”, however, reveals several more general concepts. (Note: The "comments" are visible only by following the above link to the alternate version of this blog.)
First, there is the overwhelming importance of how each person defines individual words. In this case, my entire disagreement with the original blog post was based upon using a different definition of the word “song” than the blogger, Wheat Williams, had in mind. If I accept and use HIS definition, then we no longer have a disagreement, except perhaps over the definition, which is a completely different disagreement. I suspect wars have begun over disagreements that were actually just cases of differing definitions.
Then there are cases of personal taste and differing priorities. For example, I very much enjoy the SOUND of standard “open” chords -- what Wheat Williams refers to as “the ‘cliches’ of open chord voicings” -- while he specifically tunes his guitar in such a way that they cannot be played. Even if you do not know what an "open" guitar chord IS, you can understand the conflict between one person LIKING a particular sound, and another person avoiding it. There is no particular “right” or “wrong” here, regardless of how much someone likes or dislikes some particular aspect of guitar playing.
Finally, there is my spelling of “bar” chords, which Wheat Williams corrects to “barre”. I freely admit that I do not possess the expertise to debate anyone on this point. My focus has always been on PLAYING the guitar, and not so much on SPEAKING about it, and almost no focus at all on WRITING/SPELLING about it. It may be that “barre” is absolutely correct and “bar” is absolutely incorrect. Still, for those who may not even know what a barre/bar chord IS, I quote a few sources from the World Wide Web:
From YourDictionary.com, at http://www.yourdictionary.com/barre, there is:
barre (bär) also sp. bar; and
barre also bar (bär); which, incidentally, they define as “A fingering technique used with fretted stringed instruments in which a finger is laid across the fretboard to stop all or several strings at once.”
From Answers.com, at http://www.answers.com/topic/barre-chord, there is:
Barre chords (also known as bar chords, but more commonly spelled as "barre")
Timothy Woods, in a Suite101 article at http://guitar.suite101.com/article.cfm/learn-how-to-play-bar-chords-on-guitar
“The bar chord (or barre chord, as it is also spelled)”
Harvey Reeves, in an ezine article at http://ezinearticles.com/?Guitar-Bar-Chords---The-Secret-of-How-to-Play-E---Shape-Bar-Chord&id=1814115
“I've been looking at a lot of advice on how to play bar chords, or barre chords (they're the same thing)”
Chip Evans, in the Acoustic Guitar Forum at http://www.acousticguitarforum.com/forums/archive/index.php/t-59043.html
“during my 35 years of teaching I have never heard the term "barre" used as much as it is on this forum. If english is your language the term "bar chord" is ok to use, in fact, they are easier to play :)”
But my favorite online barre/bar comment comes from Guitar for Beginners, at http://www.guitarforbeginners.com/barre.html
“Barre chords, or to use the less pretentious spelling, bar chords, are the reason the guitar is so cool.”
My POINT in listing these six online sources is NOT to debate the issue of which spelling is correct. Anyone can put anything on the Internet, and “barre” may still be absolutely correct and “bar” may still be absolutely incorrect. On an intellectual/philosophical level, the thing I enjoy most about the barre/bar “debate” is that it is NOT a debate between those who believe “barre” is correct and those who believe “bar” is correct. It is a debate between those who believe it MATTERS which spelling you use, and those who believe the two spellings are pretty much interchangeable. Such debates are tricky. It is usually easier to debate which of two positions is correct, rather than to debate whether the issue is significant -- and sometimes these two separate debates get confused with each other.
Still, I do not know enough about “barre” chords to know even whether the issue is significant. At this point, I am torn between just trusting Wheat Williams, who seems like a knowledgeable fellow, and sticking with my initial uneducated opinion that “barre” seems a bit pretentious.
I want to thank Wheat Williams for his original post, and his excellent blog, and for graciously contributing to THIS blog.
Truth is complicated.
Friday, July 30, 2010
Concerning songs and guitars
Among the many things I do not fully understand about blogging, there is the entire area of “comments” or replies. As far as I can determine, only one real, legitimate comment has been posted so far on my blog -- the blogging service has automatically filtered out others it considered “spam”.
Among the murky issues, there is copyright. If I write it, whether it is posted on my blog or on someone else’s, it is automatically copyrighted to ME, just as anything YOU write but post on MY blog is copyrighted to YOU. This is a major reason many blogs frown on anonymous comments -- since anything YOU write remains YOUR intellectual property, the blog owner needs some way to trace ownership of the comment back to YOU.
The other day I ran across an interesting blog post which has inspired me to post a comment on that blog. To keep the ownership issues clear, I have decided to post my comment first here, on my own blog, before I post it as a comment elsewhere. Perhaps this is stupid or otherwise inappropriate. I continue to learn by doing.
The original blog post was entitled “How Rock Ruined Songs.” The blogger writes eloquently, and any attempt on my part to summarize his post will not do it justice, but he basically argues that since rock guitar parts tend to be played in specific keys, this limits the vocalist to singing in those keys, rather than freely transposing into keys more comfortable for the individual vocalist. As I said, this is an oversimplification, and if you are interested, you should refer back to the original post at http://wheatwilliams.com/wordpress/2009/09/18/how-rock-ruined-songs/.
Here is my response:
You make many interesting points, and I cannot disagree with your conclusions -- provided that you define “song” as a melody-driven primarily vocal endeavor, with other instruments relegated to supporting the vocalist or vocalists.
For me, it is more complicated. When I listen to music, whether live, recorded, or broadcast, I tend to view the vocalist(s) as simply one of the various instruments -- often a solo instrument, but not necessarily much more important than the others. Furthermore, I am increasingly convinced that the overall appeal and “sound” of any given piece of music depends more on the underlying chord structure than on the melody itself.
After voice, my formal musical training was piano and then trumpet, with guitar coming third among my main instruments. I perform most regularly on trumpet, in a brass quintet, a symphony orchestra, two brass bands and a variety of other groups. Incidentally, my favorite style for trumpet playing is jazz. My favorite style for guitar playing is folk.
While I am not sure about the proper musical term for this, one of my favorite things about the guitar is its irregularity. That is, due to the traditional tunings of the six strings, the chords used to accompany a song in the key of G sound completely different than the chords used to accompany a song in the key of A. On the piano or on the trumpet, I can play a song in either G or A, and the difference between the two keys will be mostly just the difference in pitch. On guitar, EVERYTHING changes -- or, at least, in my humble opinion, everything SHOULD change. I have little respect for those guitarists who eschew open chords and instead focus mostly on bar chords, making the keys sound interchangeable.
When I listen to a recording of a piano or a trumpet, it is a relatively straightforward matter to roughly duplicate what I hear. With a guitar, it can be an incredible challenge discerning how the guitarist used the particular combination of fingers and strings to achieve the sounds -- especially if the guitarist used a capo or non-standard tunings. Again, this is one of the things I love about the guitar, and, in my mind, what sets it apart from other instruments.
Incidentally, when I sing along with my guitar, I make extensive use of a capo to try and match the chords and fingerings of other guitarists, or simply to get the “sound” I am striving for, while suiting my own unique voice. I also accept that there are certain songs I simply cannot perform the way I would like to perform them, since even with a capo they fall outside of my vocal range.
I am not saying that a guitarist is obligated to copy the notes or keys of some other guitarist, although that is an option, and sometimes the best option. (Sometimes another instrument will attempt to duplicate the notes of a guitar riff -- this is one way to handle performing a guitar-based song in a completely different key.) The guitarist is simply obligated to attempt to contribute to each unique piece of music using their unique abilities and the unique characteristics of the instrument -- which they should embrace, rather than struggle against. Not every song is suited for every instrument or every instrumentalist or every vocalist, and there is no reason why they SHOULD be.
Finally, while I am not a music historian, it seems to me that “from the dawn of time” most early instruments capable of varied pitches tended to have a limited range and a limited number of available keys, so the vocalists would have to accommodate THEM. The development of instruments capable of easily performing in all keys came later ...
Still, it all comes down to your definition of “song.”
Among the murky issues, there is copyright. If I write it, whether it is posted on my blog or on someone else’s, it is automatically copyrighted to ME, just as anything YOU write but post on MY blog is copyrighted to YOU. This is a major reason many blogs frown on anonymous comments -- since anything YOU write remains YOUR intellectual property, the blog owner needs some way to trace ownership of the comment back to YOU.
The other day I ran across an interesting blog post which has inspired me to post a comment on that blog. To keep the ownership issues clear, I have decided to post my comment first here, on my own blog, before I post it as a comment elsewhere. Perhaps this is stupid or otherwise inappropriate. I continue to learn by doing.
The original blog post was entitled “How Rock Ruined Songs.” The blogger writes eloquently, and any attempt on my part to summarize his post will not do it justice, but he basically argues that since rock guitar parts tend to be played in specific keys, this limits the vocalist to singing in those keys, rather than freely transposing into keys more comfortable for the individual vocalist. As I said, this is an oversimplification, and if you are interested, you should refer back to the original post at http://wheatwilliams.com/wordpress/2009/09/18/how-rock-ruined-songs/.
Here is my response:
You make many interesting points, and I cannot disagree with your conclusions -- provided that you define “song” as a melody-driven primarily vocal endeavor, with other instruments relegated to supporting the vocalist or vocalists.
For me, it is more complicated. When I listen to music, whether live, recorded, or broadcast, I tend to view the vocalist(s) as simply one of the various instruments -- often a solo instrument, but not necessarily much more important than the others. Furthermore, I am increasingly convinced that the overall appeal and “sound” of any given piece of music depends more on the underlying chord structure than on the melody itself.
After voice, my formal musical training was piano and then trumpet, with guitar coming third among my main instruments. I perform most regularly on trumpet, in a brass quintet, a symphony orchestra, two brass bands and a variety of other groups. Incidentally, my favorite style for trumpet playing is jazz. My favorite style for guitar playing is folk.
While I am not sure about the proper musical term for this, one of my favorite things about the guitar is its irregularity. That is, due to the traditional tunings of the six strings, the chords used to accompany a song in the key of G sound completely different from the chords used to accompany a song in the key of A. On the piano or on the trumpet, I can play a song in either G or A, and the difference between the two keys will be mostly just the difference in pitch. On guitar, EVERYTHING changes -- or, at least, in my humble opinion, everything SHOULD change. I have little respect for those guitarists who eschew open chords and instead focus mostly on barre chords, making the keys sound interchangeable.
When I listen to a recording of a piano or a trumpet, it is a relatively straightforward matter to roughly duplicate what I hear. With a guitar, it can be an incredible challenge discerning how the guitarist used the particular combination of fingers and strings to achieve the sounds -- especially if the guitarist used a capo or non-standard tunings. Again, this is one of the things I love about the guitar, and, in my mind, what sets it apart from other instruments.
Incidentally, when I sing along with my guitar, I make extensive use of a capo to try to match the chords and fingerings of other guitarists, or simply to get the “sound” I am striving for, while suiting my own unique voice. I also accept that there are certain songs I simply cannot perform the way I would like to perform them, since even with a capo they fall outside of my vocal range.
I am not saying that a guitarist is obligated to copy the notes or keys of some other guitarist, although that is an option, and sometimes the best option. (Sometimes another instrument will attempt to duplicate the notes of a guitar riff -- this is one way to handle performing a guitar-based song in a completely different key.) The guitarist is simply obligated to attempt to contribute to each unique piece of music using their unique abilities and the unique characteristics of the instrument -- which they should embrace, rather than struggle against. Not every song is suited for every instrument or every instrumentalist or every vocalist, and there is no reason why they SHOULD be.
Finally, while I am not a music historian, it seems to me that “from the dawn of time” most early instruments capable of varied pitches tended to have a limited range and a limited number of available keys, so the vocalists would have to accommodate THEM. The development of instruments capable of easily performing in all keys came later ...
Still, it all comes down to your definition of “song.”
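(A technical aside, separate from the comment itself: the capo arithmetic I rely on is simple -- each fret the capo is moved up raises an open chord shape by one semitone. For anyone who wants to see it spelled out, here is a tiny sketch in Python; the note names and the capo-at-the-second-fret example are just my own illustration.)

```python
# A minimal sketch of capo arithmetic: an open chord shape, played with a
# capo at fret N, sounds N semitones higher than the shape itself.
NOTES = ["C", "C#", "D", "D#", "E", "F", "F#", "G", "G#", "A", "A#", "B"]

def sounded_chord(shape_root, capo_fret):
    """Return the chord root that actually sounds when an open shape
    is played with a capo at the given fret (0 means no capo)."""
    return NOTES[(NOTES.index(shape_root) + capo_fret) % 12]

# Playing G, C, and D shapes with the capo at the second fret sounds as
# A, D, and E -- the open "G sound" while singing in the key of A.
print([sounded_chord(shape, 2) for shape in ["G", "C", "D"]])  # ['A', 'D', 'E']
```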
Thursday, July 15, 2010
Pets and Me
I did not particularly grow up with pets. When I was very young, my family had a dog -- a Cocker Spaniel -- who lived out in a pen in the back yard. I would sometimes go up to the fence, and the dog would climb up on the fence to face me, and as far as I was concerned the dog and I had done a trick. I do not recall any of the other children in the family -- all older than me -- ever having much to do with the dog. We gave her away while I was still quite young.
When I was a bit older, about third grade, we had a lamb who lived in the dog pen. I would actually play some with the lamb, and sometimes gave her her bottle when she was young, and food when she was older. Eventually the lamb sort of grew up, and we took her back to the farm she came from.
The first pet that was truly "mine" was a tiger salamander that one of my dad's co-workers caught in his yard. I took care of my salamander and played with him and was quite fond of him. After less than a year, he escaped inside the house (he normally lived on the back porch, but they were spraying for mosquitoes so I brought him inside). My dad found his shriveled body months later, and I cried.
I had various other short-term "pets", especially orphaned and injured birds, and frogs. Most died, but a few were released alive or at least escaped to fates unknown. In general, my family had a rule against "pets in the house", though an exception was made for my "sea monkeys". As with most people who have had "sea monkeys", I was not particularly impressed with them as pets, and I do not recall for certain what became of them, but I am pretty sure they died. I also interacted with neighbor pets, and formed occasional short-term attachments to animals encountered away from home. I should mention the curious case of a robin who bonded with our family, but I will leave that for another time.
After my older brother had moved away from home, gotten married, and acquired a pair of Golden Retrievers, I would sometimes be asked to keep them for a week or two. They would reside in the same pen as our original Cocker Spaniel (and later lamb). I like to think those were pleasant times for both the dogs and me. Since I only had them for relatively brief spans of time, while I did have them I could devote most of my time and energy to their care and recreation, putting other things on hold until they were gone. Mostly we would go for walks. LONG walks. They were both very good, likable dogs, and easy to take care of. Eventually I went off to college, and the arrangement ended.
For several decades, my life was almost pet-free, though there were still relationships with neighbor pets and occasional short-term attachments. Then life began to take some convoluted turns. In the space of a few years, I first formed close relationships with several pets technically belonging to a close friend, then suddenly had my own dog, then my own cat, then more cats, more dogs, and the occasional mouse, not to mention my crippled squirrel.
Now I care for literally more animals than I can count, since I am involved with a variety of stray or wild animals who come and go and exist on the edge of my perception, animals cared for also by friends, and those animals who are one hundred percent "mine". In recent years, I have never specifically sought out a pet, and I have never purchased a pet or answered an ad for a pet or adopted a pet from a shelter -- though a couple of my pets have spent time in shelters before eventually working their way to me. All my animals are strays or cast-offs or animals for some reason needing a home. (In the case of mice, I sometimes catch house mice in live traps in the dead of winter and care for them until warm weather when I attempt to relocate them.)
(I am aware that people have a full range of emotions about pets, just as they have a full range of emotions about most topics. When hearing about the numbers of animals I keep company with, a common first response is to question whether I have had them "fixed". The short answer is "Of course." Among my largest personal expenses is pet-neutering.)
The point of this introductory discussion is just to begin to explain where I am coming from in regard to pets. Though I grew up largely without animals, they are now a huge part of my life. As I type this, there are two nine-week-old orphaned, bottle-fed kittens asleep on my lap, a somewhat older kitten asleep on the desk next to the computer monitor, and yet another cat grooming himself on the shelf six inches from my shoulder.
LATER: It's been a while since I first typed this, but I never got around to posting it. NOW there is yet another cat curled up in front of the monitor, and every now and then he hits the power button and the monitor goes off.
I COULD try and discuss the role of pets in our lives, or our relationship with the rest of the animal kingdom, but I will leave all that for another time.
Truth is complicated.
Thursday, June 24, 2010
Different
One morning in high school, the teacher introduced an "exchange student" from Colombia. She sat right in front of me, in the first seat of a row, so I automatically became her first friend. Everything about her was "different." She did not speak English very well, but she also had a "different" voice, different ways of doing things, different mannerisms, and even LOOKED different with her clothing and the way she wore her hair and makeup. The fact that so much of our common ordinary world was "new" to her was fascinating, and there was also a certain amount of notoriety associated with being her friend. At first, being around her was exhilarating ... but that was not to last. Over time, characteristics that had seemed so charming became simply annoying. As her circle of friends expanded, we drifted apart.
I suspect that there are some people who have an automatic tendency to unconsciously view "different" as "good", while others may have an equally-automatic unconscious tendency to view "different" as "bad". Imagine being offered a plate of totally-unfamiliar exotic foreign food. Some will consider this food "good" until proven otherwise, while some will consider it "bad". Some may hold onto their pre-existing bias even after tasting the food, insisting that unpalatable food is "good" strictly BECAUSE it is different, or labelling the food "bad" no matter how tasty it is. Some may taste the food with no pre-existing expectations, but I would guess this is the smallest of the three groups.
These biases apply to all areas of life. I cannot say whether the same people who like "different" food like "different" music, or whether the same people who dislike "different" cars dislike "different" people. I suspect so. It would be interesting to study.
As with our exchange student, it is easy to get caught up in the emotion of accepting something "different", just as it is easy to get caught up in the emotion of condemning something "different". As with other biases that affect our ability to see clearly, we cannot completely overcome these tendencies, but being aware of them can reduce their impact. In reality, there is no way to generalize the value of "different". Sometimes different is good, sometimes different is bad, sometimes different is just different -- though, to be thorough, sometimes different for the sake of different can be good, and sometimes different for the sake of different can be bad ... or sometimes just different.
Truth is complicated.
Thursday, June 17, 2010
New
Many people have an understandable tendency to want "new" things. New clothes, new car, new house (maybe even new spouse, but that is beyond the scope of this discussion). There are also those who tend to stick with the "old", but I suspect those who favor the "new" are a much larger group.
"New" has different meanings and different contexts. Primarily there is the question of "brand new" or "newly constructed" versus "new to you". With some things, like cars, our society tends to clearly differentiate between "new" -- cars that have had no previous owners -- and "used" -- even if the car was only driven a few miles by the previous owner. With something like a house, the distinction is not quite so absolute. If, for instance, a house was specifically constructed for another family, but they only lived there one day before selling it, it might be referred to as "new". In fact, ANY newly-acquired house might be referred to as "new". The distinction between "new" and "used" is more often applied to things smaller than houses.
A couple of special cases are "antiques" (and the closely related term "vintage") and "factory refurbished". I believe "factory refurbished" has been in common usage for only the last few decades. With "factory refurbished" items, the seller intentionally blurs the distinction between "new" and "used", often suggesting the item might have been previously owned but never used by the previous owner. Much of the appeal of "antiques" and "vintage" items comes from the idea that they ARE old and used, but sometimes unscrupulous people will attempt to pass off "new" or "newer" items as "antiques".
The real merits of "new" versus "used" vary with the specifics of the item. Sadly, there is often a trade-off. "New" items may have improvements upon older versions, but some of the improvements may involve using poorer materials to hold down the price, or short-cuts in the manufacturing process. Changes in materials, such as substituting plastic for metal, may improve a product by reducing weight, while at the same time making the product less durable. It all depends on the situation, and there is no way to accurately state whether "new" is "better" or "worse" in all situations.
This brings us back to the idea of personal bias. If someone already believes that "new" is "better", they may be unlikely to change their opinion even when confronted with ironclad facts, and the same is true of someone who believes that "old" is "better". Both groups may be able to cite factors to support their point of view.
I confess to a personal bias AGAINST things that are "new" -- partly just to oppose what I perceive to be an unfair general bias favoring the "new". Though I acknowledge that I am biased, in many cases I am fairly certain that the "old" possesses definite advantages. As a couple of examples, I cite roll-up non-electric automobile windows (which are more reliable and trouble-free than power windows, and usable even when the vehicle is totally without electrical power) and my Windows98 laptop computer, which automatically saves streaming audio as MP3 files and, compared with later versions of Windows, makes it much easier to monitor any attempts by malware to alter the system. With both of these examples, I realize that the issue is complicated. There ARE advantages to automobile power windows, such as the driver being able to operate a window beyond their own reach. Later versions of Windows make various improvements upon Windows98. As I stated earlier, there is often a trade-off when it comes to "new" items.
The problems with "new" are especially prevalent in the world of technology, where products evolve rapidly. I recently considered replacing my digital camera. To be honest, I was considering this only because I was having mechanical problems with my camera. The new-camera salesman pointed out that I could purchase a technically-superior state-of-the-art camera for much less than I had originally paid for my camera. When I pointed out that none of the cameras I was being shown possessed optical viewfinders, the salesman replied that they were no longer included on cameras in my price range. Though optical viewfinders are still included on some cameras, they all cost hundreds of dollars more than my original camera. Somewhere, someone has decided that optical viewfinders are unimportant, or at least not as important as other features, such as a larger screen. I could write many pages regarding the fact that I consider optical viewfinders to be one of a digital camera's most important features. (By the way, Canon ended up repairing my camera for free.)
Especially with high-tech products, there is also the issue of familiarity -- though this issue is not limited to the world of technology. In addition to the simple comfort that familiarity can provide, there is often a learning curve associated with new products that negates their improvements, at least until the user becomes familiar with the new product. I must admit that the skill of the product developers comes into play here: When I eventually acquire a "new" digital camera, the ease of my transition from old to new will depend to a great extent on the design of the new camera.
Sadly, this is one of those issues where it may be difficult to find common ground between those who favor the "old" and those who favor the "new". If someone truly favors the "old", an attempt to incorporate "old" features into something "new" may seem like nothing more than an attempt to quiet their objections. I recently heard someone arguing in favor of tearing down an old building and replacing it with a brand-new one, and he suggested those who favored keeping the old building might be satisfied by embedding a few stones from the original building into a wall in the new building. At the same time, someone who truly believes "new" is better will not be satisfied by even the most thorough "updating" of the "old".
This brings up the fact that sometimes there are simply issues that divide people, with no clear compromises and no real solutions.
Truth is complicated.
"New" has different meanings and different contexts. Primarily there is the question of "brand new" or "newly constructed" versus "new to you". With some things, like cars, our society tends to clearly differentiate between "new" -- cars that have had no previous owners -- and "used" -- even if the car was only driven a few miles by the previous owner. With something like a house, the distinction is not quite so absolute. If, for instance, a house was specifically constructed for another family, but they only lived there one day before selling it, it might be referred to as "new". In fact, ANY newly-acquired house might be referred to as "new". The distinction between "new" and "used" is more often applied to things smaller than houses.
A couple of special cases are "antiques" (and the closely related term "vintage") and "factory refurbished". I believe "factory refurbished" has been in common usage for only the last few decades. With "factory refurbished" items, the seller intentionally blurs the distinction between "new" and "used", often suggesting the item might have been previously owned but never used by the previous owner. Much of the appeal of "antiques" and "vintage" items comes from the idea that they ARE old and used, but sometimes unscrupulous people will attempt to pass off "new" or "newer" items as "antiques".
The real merits of "new" versus "used" vary with the specifics of the item. Sadly, there is often a trade-off. "New" items may have improvements upon older versions, but some of the improvements may involve using poorer materials to hold down the price, or short-cuts in the manufacturing process. Changes in materials, such as substituting plastic for metal, may improve a product by reducing weight, while at the same time making the product less durable. It all depends on the situation, and there is no way to accurately state whether "new" is "better" or "worse" in all situations.
This brings us back to the idea of personal bias. If someone already believes that "new" is "better", they may be unlikely to change their opinion even when confronted with ironclad facts, and the same is true of someone who believes that "old" is "better". Both groups may be able to cite factors to support their point of view.
I confess to a personal bias AGAINST things that are "new" -- partly just to oppose what I perceive to be an unfair general bias favoring the "new". Though I acknowledge that I am biased, in many cases I am fairly certain that the "old" possesses definite advantages. As a couple of examples, I cite roll-up non-electric automobile windows (which are more reliable and trouble-free than power windows, and usable even when the vehicle is totally without electrical power) and my Windows98 laptop computer, which automatically saves streaming audio as MP3 files, and makes it much easier to monitor any attempts by malware to alter the system than later versions of Windows. With both of these examples, I realize that the issue is complicated. There ARE advantages to automobile power windows, such as the driver being able to operate a window beyond their own reach. Later versions of Windows make various improvements upon Windows98. As I stated earlier, there is often a trade-off when it comes to "new" items.
The problems with "new" are especially prevalent in the world of technology, where products evolve rapidly. I recently considered replacing my digital camera. To be honest, I was considering this only because I was having mechanical problems with my camera. The new-camera salesman pointed out that I could purchase a technically-superior state-of-the-art camera for much less than I had originally paid for my camera. When I pointed out that none of the cameras I was being shown possessed optical viewfinders, the salesman replied that they were no longer included on cameras in my price range. Though optical viewfinders are still included on some cameras, they all cost hundreds of dollars more than my original camera. Somewhere, someone has decided that optical viewfinders are unimportant, or at least not as important as other features, such as a larger screen. I could write many pages regarding the fact that I consider optical viewfinders to be one of a digital camera's most important features. (By the way, Canon ended up repairing my camera for free.)
Especially with high-tech products, there is also the issue of familiarity -- though this issue is not limited to the world of technology. In addition to the simple comfort that familiarity can provide, there is often a learning curve associated with new products that negates their improvements, at least until the user becomes familiar with the new product. I must admit that the skill of the product developers comes into play here: When I eventually acquire a "new" digital camera, the ease of my transition from old to new will depend to a great extent on the design of the new camera.
Sadly, this is one of those issues where it may be difficult to find common ground between those who favor the "old" and those who favor the "new". If someone truly favors the "old", an attempt to incorporate "old" features into something "new" may seem like nothing more than an attempt to quiet their objections. I recently heard someone arguing in favor of tearing down an old building and replacing it with a brand-new one, and he suggested those who favored keeping the old building might be satisfied by embedding a few stones from the original building into a wall in the new building. At the same time, someone who truly believes "new" is better will not be satisfied by even the most thorough "updating" of the "old".
This brings up the fact that sometimes there are simply issues that divide people, with no clear compromises and no real solutions.
Truth is complicated.
Tuesday, June 8, 2010
Truth vs. Blame
Philosophers and theologians devote considerable energy to the subject of unfortunate events, and the question of why bad things happen. Sometimes, an event seems totally outside of human control or influence, and we can only question fate or God or other intangibles. Other times, people can entertain the idea that humans caused or at least could have had an impact on the unfortunate event -- whether or not this is actually the case. I might go so far as to suggest that many people seek to find specific humans to blame for tragedies, and somehow find comfort in the idea that the event was NOT a totally random act of providence.
When such a question arises in a hospital setting -- that is, when there is an unfortunate event that may have been influenced by humans -- many hospitals examine the event in the setting of a "Morbidity/Mortality" conference. While I suppose that each hospital may have their own variations on such conferences, a common, key feature of the "Morbidity/Mortality" conference is that it be totally, one hundred percent, guaranteed confidential, and in no way open to the public. The reason for this feature is that the underlying purpose of the conference is to seek the truth of what happened, and especially to determine whether anything can be done to improve the situation or to keep it from happening again. This becomes much more difficult, if not impossible, if we are at the same time attempting to assign blame or guilt. As long as no one fears being blamed for their actions, ideally each person can come forward with anything they may have done or not done which may have in any way contributed to the outcome of the event.
Now let's imagine that a beloved family member died while in the hospital, and you suspect someone on the hospital staff made an error that led to the death. It is predictable and understandable that you might want to assign blame, and even prosecute the "guilty" staff member. However, your family member is already dead, and there is nothing that can be done that will bring them back. If their death was indeed caused by an error on the part of a hospital staffer, the only thing that can be done that will actually preserve human health and wellness is to figure out what happened, and try to prevent it from happening again. This will be much more difficult if hospital staffers who may have done nothing wrong are afraid to step forward for fear of being found "guilty" of something. Ideally, everyone involved will want to seek the truth, and anyone who DID contribute to the negative outcome will fully come forward, but this is deeply contrary to human nature, especially if we only entertain vague suspicions that somehow we ourselves may have contributed negatively.
Theoretically, the processes of finding truth and assigning blame are not mutually exclusive. In fact, when full truth is known, then blame may perhaps be accurately assigned. The problem is that finding the full truth -- especially when that truth is complicated -- is best accomplished with honesty, open communication, and unbiased full revelation and examination of the facts, which is made difficult if not impossible by the attempt to assign or deflect blame, or to establish or deflect guilt.
Sometimes it is enough to assign blame or establish guilt without fully discerning the total truth of the situation. In the case of criminal activity, such as robbery or murder, probably the most important thing is to establish who is guilty, and punish them, without ever fully comprehending the subtle, complicated truths of the event in question. Other times, I question the underlying goals of a course of inquiry, and what it is we are truly trying to accomplish. I admit that my own bias often comes down in favor of finding truth rather than assigning blame.
One of the best examples of what I am discussing is the US government's "commission" and "hearings" to examine the terrorist attacks of September 11, 2001. I am generally quite proud to be an American. I cannot recall a time I was LESS proud to be an American than when observing the actions of the nine-eleven commission, which quickly degenerated into an attempt to assign blame, at the expense of ever finding the full truth. This was a case in which the two goals -- truth vs. blame -- were clearly mutually exclusive. Perhaps I am wrong. Perhaps in this case, assigning blame WAS more important than finding truth, but that is difficult for me to accept.
The subject of truth vs. blame involves a complicated relationship between the past and the future. For me, it often comes down to a question of focusing on the unchangeable past or the unfolding future. While I fully agree with the idea that "Those who cannot remember the past are condemned to repeat it" (most commonly credited to George Santayana), those who choose to focus primarily on assigning blame are often willingly sacrificing a potentially fuller understanding of the past, limiting the knowledge with which they face the future.
Another, more current, example involves the recent oil spill in the Gulf of Mexico. As oil continues to gush into the ocean, with technicians on scene and experts throughout the world searching for a way to stop the ongoing catastrophe, the US government is discussing the possibility of criminal prosecutions for wrongdoing. This may score some points politically, especially with those demanding that the government "do something" or "do more", but it is difficult to comprehend how it will help solve this or future problems. It is easy to comprehend the likelihood that criminal prosecutions, while possibly "sending a message" to future oil-drillers, will have the effect of obscuring the truth of what has happened, and making an unfortunate repeat of the event much more likely.
Truth is complicated. I certainly do not claim that one way is always "right" and the other way is always "wrong". I am merely stating that there is often a profound but overlooked difference between finding truth and establishing blame. When you or your loved ones have been wronged, it is fully understandable that your priority may be to punish the "guilty", and I cannot argue with that. Sometimes, though, we should consider the possibility that the most important thing may be finding the truth, and in order to do this we may have to sacrifice blame and punishment.
Wednesday, May 19, 2010
Categories and Labels
I find the subject of "thought" and "learning" to be fascinating, but I have no particular expertise on the topic. One of the things that fascinates me about thought is that we are not necessarily aware of our own complicated thought processes. I wonder, for instance, to what extent language influences our thought processes. When someone is raised without a spoken or written language, does some sort of internal "language" just develop spontaneously in their own brain?
Keeping in mind that I claim no particular expertise on the subject of thought, I would guess that one of the most basic aspects of thought and learning is placing things into categories, comparing one thing with another thing. For example, each new object a creature encounters is evaluated for whether or not it presents an immediate threat or danger. Another example is the way babies amuse us by their frequent difficulty distinguishing "food" from "non-food".
Every day of our lives, we encounter objects or situations that we have never encountered before -- at least not the IDENTICAL object or situation. One of the ways that we cope with this potentially-overwhelming onslaught of new information is by recognizing the similarities between different things, and unconsciously assigning categories and labels. If I am walking along a sidewalk and encounter a garden hose, it does not particularly matter to me whether I have ever seen that identical garden hose before. I have encountered garden hoses before, and I immediately label it a "garden hose" and place it in the same mental category as other garden hoses. Unless it is being brandished in a menacing manner, I do not view it as a threat. Though it bears certain similarities to a snake, I do NOT place it in the same category as "snakes". I MAY place it in a similar category to the child's toy I encounter nearby ("stuff someone left out in their yard"), even though it bears little physical resemblance to the toy.
I can place the garden hose in an infinite number of potential categories, such as "hoses", "things I might trip over", "yard care implements", "sources of water", "round", "hardware store items" ... the number of potential categories truly is infinite. A crucial, oft-overlooked point is that the categories are something I am imposing on the garden hose, consciously or unconsciously. They are not attributes of the garden hose, though a garden hose can be categorized based on its attributes. I may place the garden hose in the category "made of rubber" when in fact it is made of plastic, and this error on my part may or may not be significant, but it has no effect on the make-up of the garden hose.
The categories that we use depend on a complicated blend of ourselves -- our own backgrounds and priorities, among other things -- and the particular situation. If we are looking for something to rescue someone fallen down a well, then rope, cable, and hose may all fall in the same category. "Things to siphon gasoline" is a completely different category, but it overlaps with "hose".
It is one thing to categorize inanimate objects. Categorizing activities, relationships, or animate objects grows much more complicated. Among the additional difficulties is that these things may be constantly changing. The relationship between two people, for example, varies from moment to moment, and is constantly evolving. It may be accurate enough to state that people are "friends" or "spouses", but if we try to apply more specific categories, we may be in for trouble.
This brings up the separate but related issue of definitions and terminology. There are well-known issues with the terms "fruit" versus "vegetable". Many things that are scientifically considered "fruits" are often categorized as "vegetables". If I agree to bring a "fruit salad" to a picnic, and I show up with a blend of tomato, squash, and cucumber, I may cause concern among those expecting apples, oranges, and bananas. In this case, I would be scientifically correct, but probably out of the mainstream.
"Mainstream" categories probably never precisely match our personal categories, just as our personal definitions of words do not precisely match dictionary definitions. This can cause conflicts when our categories differ in crucial ways from those around us. The person who asked me to bring a "fruit salad" to the picnic may be very disappointed with my choice of fruits.
Categories and labels are necessary and helpful while also being dangerous and obstructive. Once we label something, we are mentally assigning attributes to this thing that it may not possess, while potentially overlooking or denying attributes that it DOES possess. Even if we categorize it correctly, and assign only the correct attributes, by the very act of placing it in a certain category, we are shaping how we will view that thing.
In this "digital" era, we try to reduce infinite variation into finite variation. There are virtually infinite variations of what humans perceive as "color." To "digitize" color, we represent each color by a different number. The more numbers we use, the more variations of color we can represent. For some purposes, it is enough to have eight or nine colors. Blue is blue, red is red. Real life, though, suffers no such restrictions on the number of observable colors.
In effect, we now try to "digitize" everything. Uncomfortable with infinite variation, we seek a limited number of alternatives. We place things in arbitrarily-defined categories based on our own previous experiences, and then grow agitated if they do not seem to precisely "fit" these categories. Many a musician has fallen out of favor for performing music outside of their "category." A "country" musician can be criticized for not being "country" enough. Several famous musicians are credited with making the observation, "There are only two kinds of music, good music and bad music." Music, and so much of our universe, defies being confined into precise categories.
Categories and labels are strictly ways of looking at things. There is no "best" way to look at something, though certain ways may be most effective or appropriate for certain situations. While there is no "right" category or label, they can be "wrong", though sometimes, as in the case of fruits and vegetables, even "right" and "wrong" depend on the specifics of the situation.
Truth is complicated.
Wednesday, May 12, 2010
Damages
I recently read an article on the World Wide Web regarding an incident which has led to a lawsuit. ALL I know about the incident is what I read in the article, and I do not know any of the people involved, whether the article is accurate, or whether perhaps it is all fiction. Regardless of the details or truth of the article, it raises some profound issues.
According to the article, a twelve-year-old was writing on her school desktop -- that is, writing directly on the desk (the article calls it "doodling") -- "with erasable green marker" and as a consequence was forcibly taken to "the dean's office" and searched before the police were summoned to arrest her. The police handcuffed her, took her to the precinct headquarters and detained her there, "handcuffed to a pole for more than two hours."
The article does not say exactly what happened next, but apparently she was released, eventually suspended from school, and "given eight hours of community service and ordered to write a book report and an essay about what she learned from the experience."
Later, "New York City officials acknowledged Gonzalez's arrest was a mistake." Now, the student and her mother are "suing the New York City Education Department and the New York Police Department for $1 million in damages, claiming excessive use of force and violation of the girl's rights in the ordeal, which Comacho has called a "nightmare.""
As I have already stated, ALL of my information comes from this one article, and I have no reason to believe it is totally accurate, but for the purpose of this discussion I will consider the entire situation hypothetical, and therefore exactly as I have just described it. Even in this oversimplified form, the issue is complicated.
Thoughtful, rational, intelligent people might have a full range of reactions to this incident. Some would say the student got what she deserved, or possibly should have been punished more severely. Probably more would agree that the authorities overreacted. Some might argue that the student deserves even more monetary compensation than one million dollars.
I am honestly not sure where I stand on this issue, but I am deeply troubled about the source of the million dollars sought by the student and her mother. Under our present system, any money that they receive is unlikely to come directly from any of the individuals involved. IF the money comes from "the New York City Education Department and the New York Police Department" then it is actually coming from taxpayers and the various sources of funding for those two departments. More likely, the money would come from insurance policies, in which case it is coming from a wider pool, and in a sense, from ALL of us. Either way, the money would be coming from a large number of individuals uninvolved with the case. One can argue that this is an abstraction, or irrelevant, but it remains true.
No matter how badly the student was mistreated by school officials and/or police, it is hard for me to accept the idea that she now deserves a million dollars from the rest of us -- and the money would indeed be coming from the rest of us. If instead the student and her mother were demanding that individuals involved in the case be reprimanded, fired, or personally punished in some other way, I could not argue -- but instead they are asking for a big chunk of cash from the rest of us.
There are those who would argue that the student has been deeply scarred by the experience, and for that I am willing to offer her an apology from all of society -- but not one million dollars. (I do not really see how a million dollars is going to un-scar her, though perhaps it would pay for therapy, if she chose to use it for that purpose.) Others will argue that the amount needs to be that large to "send a message": to make the individuals involved reflect upon their misdeeds. I am unconvinced that this is effective, since the money is actually coming from a vast pool of individuals ... BUT, if this is truly the case, then perhaps I would be willing to have a million-dollar fine imposed, but paid to some worthy charity or perhaps used to reduce the national debt. This would still "send a message" but it would not personally enrich the student and her mother at MY expense.
Finally, there are those who would argue that the money is coming from SUCH a vast pool that the effect on me personally is insignificant, and therefore I have no right to object. In a world with finite and dwindling resources, and many people in hardship, I cannot accept this idea. Even pennies add up, and millions add up more quickly.
This is the way I view most large monetary damage awards. For those who are truly deserving of a large financial settlement, I suggest we set up a system where anyone who wishes to contribute can do so. This student and her mother might end up with considerably MORE than a million dollars, which would be fine by me, as long as the money is contributed voluntarily. I am willing to offer sympathy, and I have no problem with punishing those who wronged you, but unless you can make the case that YOUR loss entitles you to MY money, keep your hands out of MY pockets!
Wednesday, May 5, 2010
Inches
In previous entries I have mentioned sports metaphors, while acknowledging my overall lack of skill and knowledge regarding most sports.
One of my favorite sports metaphors is "a game of inches", which I am pretty sure refers to football. Basically (and this is a huge oversimplification) the entire object of football is for one team to move the ball one hundred yards, against the other team which is striving to keep them from accomplishing this. Although the football field is over one hundred yards long, sometimes the teams find themselves in a situation in which a few inches will make the difference between success and failure. Officials may stop the game to take measurements to determine which team has been successful.
I am gradually becoming more and more convinced that much of life is "a game of inches" -- that the difference between happiness and unhappiness often comes down to inches rather than miles, or seconds rather than years, or pennies rather than dollars. Depending on how you look at it, this can be an uncomfortable philosophy.
Once in high school track, I lost a two-mile footrace -- in that case, a race lasting over eleven minutes -- by a fraction of a second. For the rest of my life, I have been haunted by the idea that if I had done something slightly differently, or tried slightly harder, I could have won that race. In some ways, it would be easier to believe that my opponent was simply faster, or having a better day, and I could not possibly have won -- but that does not seem to be the case.
When we fail to achieve a goal, it is often tempting to believe that the odds were somehow stacked against us and failure was almost certain, rather than that we came ever-so-close to succeeding, especially if the lack of success involves some tiny failing on our part.
This is perhaps even more true when it involves our effect on other people, and their effect on us. In an episode of the fictional sitcom "Becker", Becker asks to borrow a number two pencil from another student shortly after beginning an important written exam. The student, distracted by the request, mis-marks most of his answer sheet, fails the exam, and, as a direct consequence, is not admitted to medical school, and follows a different career path for the rest of his life. You can argue that the student SHOULD have been more careful, or SHOULD have retaken the exam in the future, or somehow risen above the obstacle Becker placed in his path, but the (fictional) fact remains that if Becker had NOT asked for the pencil, the student's life would have unfolded in a different direction.
The "Becker" episode is a fictional example, but illustrates something that can and does happen in real life, probably more often than we can ever know. It is not comfortable to realize that we can dramatically alter either our own lives or the lives of others by something as minor as asking to borrow a pencil, and I am sure that many people would voice strong objections to the idea. To be fair, the Becker example is somewhat negative, while a small thing might also have a positive effect.
It is tempting to give credit or blame to "luck" or "fate" in the cases of small things having large impacts. Most collisions between automobiles would not have occurred if either vehicle had been at a slightly different location or at a slightly different time or under other slightly different circumstances. The fact that they did occur was "bad luck". At the same time, many collisions between automobiles involve predictable problems and preventable risks -- a drunk or otherwise distracted driver, or perhaps someone running late and speeding because someone had asked them for a number two pencil.
Real life is complicated. In real life, everything that happens involves a web of an infinite number of factors. Sometimes we can give credit or blame to one obvious factor, such as a particular act of valor changing the course of a battle. More often, success or failure depends on a multitude of factors both large and small, and changing any one of them might change the outcome. Without a time machine, we can never be certain what effect any change would have produced (though we can have hypothetical discussions, run computer simulations, and otherwise engage in speculation).
I could give other examples of seemingly insignificant things having large effects, but, as I have stated, this is an uncomfortable idea, and will always produce a certain amount of resistance. It is especially uncomfortable to accept that something very small we have or have not done has had a large negative effect on ourselves or someone else. In the case of someone else, we refuse to acknowledge that their problem is our fault -- they should have been able to overcome the tiny problem we threw at them. In the case of ourselves, it seems preferable to blame outside forces, or overwhelming forces.
I realize that I am simultaneously arguing that we are reluctant to admit the large negative impact we may have on others, while arguing that many of our own problems that we blame on others are in fact due to small failings on our own part. This may seem inconsistent. Furthermore, I am largely ignoring the small things that have a large positive impact.
Truth is complicated.