Maybe. I'm not sure. If I hadn't been taking the Writing for the Web course at BCIT, I don't think I would have started another blog, let alone one on technology. But I don't think I'll delete it, and maybe I'll even add to it. I was surprised at how easily this came to me.
It took me a long time to embrace my geekiness. While I was working on technical trade publications, I thought of myself as a musician with a day job, because playing in a rock band was what I really loved to do. When I was first working in software development, I thought of myself as an actor with a day job, because I had shifted into theatre and loved it. When I immigrated to Canada, I did so as an experienced software developer. I didn't tell them that I was really going to Vancouver to work in the film industry. As it turned out, even though I got plenty of work as an extra, I made more money as a technical writer.
I started to do software development again in 2000. Before long, with my music and acting more behind than ahead of me, I realized that I was fine doing software. I learned. I got better at it. And I found that in a lot of ways it was a creative outlet for me.
I guess it was really a new century. I started to admit that I was a software developer, not an aspiring actor or musician. This still surprises people sometimes, because even now there aren't that many women who work as software designers and developers.
Perhaps that's why it's important to me that people know there are other sides to me. If you've visited my "real" blog, Fromage, which I surreptitiously linked to over there on the right, you'll know what I'm passionate about. I write about fashion, food, music, politics and feminism. Sure, I'm a geek, but I'm also a fashionista, dammit! And I play guitar and even drums when I make time to practise! And I make really tasty food!
The passions that aren't revenue-neutral, however, tend to be revenue-negative and sometimes downright expensive. They are habits that need to be fed. So I continue to work in high tech.
It's not just a job, though. I like what I do, and I'm interested in how technology is evolving and changing all our lives. I'm old-fashioned in a lot of ways, but I'm not a Luddite. I'm very attached to my laptop and my high-speed internet connection. They allow me to put my words out there for anyone to see! The fact that there are millions and millions of us doing the same thing doesn't make it any less cool.
Saturday, October 8, 2011
Friday, October 7, 2011
market economy
Blog was originally short for "web log." I wonder how many people remember that. The idea was to keep a log, or journal, or diary, but to do so in public. It was on the web, so other people could read it. Some wrote about whatever was happening in their lives. Some wrote about specific topics. But now, I doubt that any of the "my boring life" blogs will get a large readership, unless the person is very funny or witty—which sometimes happens. Seinfeld was, after all, a show about nothing, but it was some of the funniest nothing ever.
Twitter started in a similar way. People tweeted their every mundane action in 140 or fewer characters. "Just had lunch at Bungie Burger. Yum!" It wasn't long before people tired of that sort of thing. Anyone who tweets things like that any more is unlikely to have many followers. Maybe even many friends!
Both blogging and tweeting have evolved into something else, at least for those who want it: ways to create your own personal brand. It used to be that a brand was only something owned by a corporation. Now, we have the democratization of branding. Bloggers and tweeters are no longer just people. They are their own product, and they engage in their own marketing.
It was happening right from the start, but usually in a passive way. We built it, and sometimes they came, thanks to search engines and word of mouth. But now, the competition for eyeballs is more active. We share our blogs on Facebook. We set them up to post a URL automatically to Twitter. We set up links to and from other blogs. We comment on other blogs and leave a link behind when we do.
I think a lot of this has to do with the extreme competition for employment and contracts. We no longer approach an interview with our degree and work experience, hoping to be a good fit. It has become almost a company-to-company negotiation. We have strengths, assets, abilities. We show how our brand can help their brand—and how it can do so better than all those other personal brands out there can.
Many of us now need a web presence as much as a corporation does. We market ourselves actively. And thus we must be careful about our web presence. We might need to compartmentalize our lives, or simply keep the fun (yet embarrassing) stuff off the web entirely. The plus side of living in public is that your accomplishments and abilities can be well known. The downside is that you might be highly visible, warts and all.
The better part of valour is discretion, wrote the Bard. If you wake up with a hangover and can't remember what you did the night before except you're pretty sure there were police involved, that might not be the best subject for a blog post or a tweet, or even your "private" Facebook wall. On the web, it's easy to overshare. It's very difficult to take it back.
Addendum: which is better—no photo or (one hopes) a good photo?
Saturday, October 1, 2011
snap, crackle, pop
A comment left on my words on a page post reminded me of another major change in technology: audio recording. The commenter remarked that reading on a Kindle versus reading a book was a little like listening to an MP3 versus listening to a CD. And I thought, what about any digital recording versus a phonograph record or tape?
Music preserved
The first kind of audio recording method was analogue. Every step in the process involves either vibrations or something analogous to vibrations. An orchestra plays. Sound waves strike the diaphragms of microphones. The microphones convert the vibrations into their electrical analogue. The electrical signal magnetizes a recording head, which arranges the metal oxide particles on a tape into yet another representation of the original sound waves.

The tape might then be used to master a phonograph record, in which the sound wave analogue is a pattern of slight irregularities in a groove of vinyl. And finally, either the tape magnetizes a playback head and produces an electrical signal, or the stylus of a phonograph vibrates according to the record groove and produces an electrical signal. That signal is amplified, and the strengthened signal causes a speaker to vibrate and reproduce, with greater or lesser fidelity, the original sound.
At each stage of an audio recording, there is something of the original sound wave intact. The electrical signal resembles the sound wave. The magnetic pattern resembles the electrical signal. The grooves in the record resemble the magnetic pattern. The electrical signal resembles the vibration of the grooves. The vibration of the speaker resembles the electrical signal. And finally, the sound waves produced by the speaker resemble the vibration—and every previous form of the wave back to the original. The trick in analogue audio recording is always to capture as much of the original wave as possible, losing as little as possible and adding as little as possible (distortion) in the process of recording and reproducing the sound.
It's remarkable what this technological descendant of Thomas Edison's invention—speaking into a cone, which caused a needle to vibrate and trace a pattern in a wax cylinder—can produce. The best analogue audio recording and reproduction equipment results in a sound that is impressively faithful to the original, including its dynamic range, its frequency range, and its harmonics and overtones.
Of course, most of us can't afford the finest reproduction equipment—things like a super-quiet turntable, wooden tonearm, multiradial stylus, low-distortion amplifier, and speakers accurate from 20 to 20,000 Hertz.
Music encoded
Digital recording does not produce analogues of the original sound wave. Instead, it encodes sound information as a stream of ones and zeros. What is remarkable is that this works at all—that a reasonable facsimile of what the New Pornographers sounded like in the studio comes out of our speakers or headphones without any signal loss, hiss, hum, clicks, or other byproducts of analogue recording and reproduction. What is less remarkable is how well the encoding actually works. Yes, you can hear the squeak in Ringo's bass drum pedal. But are you truly hearing what the Vienna Symphony Orchestra sounded like in the concert hall?

Sound information is incredibly complex. Take just one instrument, such as a cello. The sound it makes can be loud or soft or anywhere in between. It has a wide frequency range, which includes not only the notes actually played but the harmonics and overtones generated by bow and fingers on strings, amplified by a wooden body. There are subtler qualities too, such as how each note starts and ends and whether it sounds rich or thin.
In digital recording, all of this information has to be encoded in such a way that it can be reproduced faithfully. And it works, to an extent. For most of us, a compact disc sounds wonderfully clear and realistic. To an audiophile, CDs have a harshness that is not present in analogue recording, more apparent in orchestral music than in the rock and hip-hop and other popular music that many of us listen to. (I remember an all-digital recording of the solo piano version of Pictures at an Exhibition by Modest Mussorgsky that even to these non-audiophile ears was unlistenable.) Digital is an improvement in many ways over analogue, but in many other ways there is a loss of quality that is not dissimilar to the difference between reading from a Kindle and reading a book. It's a difference in quality more than in quantity.
The loss of quality is exacerbated in a file format such as MP3. Encoding all that complexity takes one heck of a lot of ones and zeros, so to keep file sizes down, MP3s are usually encoded at a much lower bit rate than CD audio, with the encoder discarding the parts of the signal we are least likely to hear. That can cost an MP3 both dynamic and frequency range. It simply doesn't retain enough information to do a great job of reproduction. We sacrifice quality for convenience, and most of us don't notice the difference.
The warmth of imperfection
Analogue recording isn't simple, but there is an elegance to it. Waves become waves become waves and finally turn back into waves very much like the originals. Wax cylinders are still playable using very simple equipment. And in the words of a story I once heard, vinyl is final. It warps, is too easily scratched, and even breaks, but in many ways it is incredibly durable. You can still play great-grandpa's 78s. We now know that CDs aren't nearly as immutable as was once thought. When analogue media deteriorate, you can still make out a semblance of the original sound. The same is not true for encoded bits. They work or they don't.

Despite what I know, I have far more CDs than records. I have a turntable that is not set up. I listen to MP3s more often than anything else. Yet I understand the appeal of records and tapes. There was no mystery to them. You can understand how sound vibrations are turned into something similar and then turned back into sound. I have no clue how music is digitally encoded.
I have some MP3s that I ripped from an old audio cassette that was recorded from a borrowed copy of Unknown Pleasures by Joy Division. Each pop is familiar. There is something oddly comforting about this imperfection.