Have you ever had a really bad day? Of course, that’s a ridiculous question. We’ve all had really bad days. Sometimes the bad days truly are bad – some circumstances are just awful. Things happen to all of us occasionally: the death of a loved one, losing your job, a falling out with a close friend. Fortunately, these events are uncommon for most people. These kinds of bad days, though terrible, are few and far between.

So, if that’s the case, why do people seem to have so many “bad” days? Why do so many people talk about how “stressful” and “challenging” and “irritating” and even “horrendous” their day has been? If nobody’s died, your job is still intact, and your friends still love you… what is it that’s dragging you down?

Most of the time, people’s bad days revolve around their work. What I’ve been observing lately is that people who are stressed and challenged and irritated are that way because of something that happened during their work day. Ever notice how fewer people tend to have “bad” days on Saturday? Yah.

You might respond to that question with “Well, not everyone likes their job.” In fact, it’s true that some people downright hate their jobs. I can see how it would be difficult to have a good day in that situation. I’ve had jobs I hate. I know what that’s like. But here’s the thing. I believe that 99% of the “bad” days we have don’t really exist. Yup. 99%. (1% is reserved for catastrophic situations like natural disasters.) Why’s that? I’ll show you.

It’s all in your head.

Ever had a day when you wake up in the morning and your mind is immediately racing with all of the worries you have for the day? “What if my computer crashes during my big presentation?”, “What if the boss hates my designs?”, “What if nobody shows up for the conference?”, “What if that agent doesn’t call me back in time for tomorrow’s shoot?” The next time you have worries about your day, write them down.
Then, at the end of the day, go back and review your list. Odds are, most or all of the worries you woke up with did not happen. You can’t predict the future, so stop trying. You can only control the now. Stop worrying so much. You’ll have far fewer bad days.

You wouldn’t talk to ME like that.

The problem with stress is that it compounds. The minute a stressful moment pops up in your work day, your brain automatically goes into fight-or-flight mode. You either want to defend your every move or run and hide under the bed till the storm blows over. Thing is, all this stuff is – you guessed it – mostly in your head.

Example – you have to present your design wireframes to your boss this morning. Your boss is notoriously picky. You walk through the scenario over and over in your head. You’ll walk in. Show him the designs. Your boss will go into a rant about how you don’t understand the concepts or even the basics of how this company is run, and you never will. In your head you think, “Oh, here he goes again. What a jerk. He is such a bad communicator.” All the excuses for why it’s your boss’s fault. You feel like the worst designer in the world. You frickin’ no-talent hack. Loser. You’ll never get anything right. The pit in your stomach grows. Oh wait. You haven’t even had the meeting yet, and you feel awful.

Imagine if you said any of those things to someone else. A colleague, family member, friend. Calling them a no-talent hack. A loser. Of course, you’d never do that! But we do it to ourselves all the time. We make up entire scripts, play out the entire episode before anything even happens. When you think about it, it’s pretty ridiculous. Next time you find yourself making up the script, stop. Take a minute. Think to yourself… would I talk to my Mom like this? My brother? My wife? My friends? If you can’t imagine doing it, then stop talking to yourself like that. Immediately. Stop making up stories and assumptions.
If you go into a situation with a clear mind, nine times out of ten it will turn out more favourably than you imagined. It works. Just try it.

Like I said, we all have bad days. Sometimes it’s out of our control. But most of the time, deciding whether our day is good or bad is simply a matter of adjusting our view. Get out of your head. Get into your life. Once you’re there, the possibilities are endless.
Web 3.0: Web 2.0 With Pants – Part II
This is part two of a two-part post on Web 3.0 by my friend Don. You can catch up by reading part one here. Today, Don takes us into the present day, exploring the current state of the Web, and provides insights on how we can meet the challenges of the emerging Web 3.0 era.

In January 2009, Jessi Hempel of CNNMoney.com actually declared Web 2.0 “over” and welcomed the emergence of Web 3.0. According to her, Web 2.0 has been a “total bust.” The social networking services that personify Web 2.0 – MySpace, Facebook, Twitter – have enormous fan bases but are not exceedingly profitable. MySpace is expected to bring in $400 million less revenue than projected. Twitter has no business model. Google has not been able to shift away from relying on advertising revenues, and its purchase of the popular video sharing service YouTube has not produced discernible profit. While Web 2.0 has changed the way people interact with the web (i.e. disruptive technologies), it has failed to produce financial success equivalent to that of its surviving dot-com predecessors, Amazon and Yahoo.

The piece echoes Dan Farber’s two-year-old conclusions: few social networking companies have viable business models, and everyone is in the cycle of raising money and trying to “score a life- or business-altering hit”, potentially in the hopes of being acquired. Hempel also points out that many companies rely on advertising, which is problematic because, as a recently published Time magazine piece states, social networks break from the traditional model that brought marketers to the web. They do not permit advertisers to target by subject. While these services continue to gather large numbers of people together, they have proven unattractive to marketers because no one knows who users are or what they want. The result: an ad on Yahoo’s news portal commands 30 times the value of an ad on Facebook. Hempel’s piece ends with a longing for what Web 3.0 may bring.
In 2007, Google CEO Eric Schmidt defined Web 3.0 at the Seoul Digital Forum. After first joking that Web 2.0 is “a marketing term”, Schmidt theorized that, while Web 2.0 was based on Ajax (Javascript and DOM), Web 3.0 will be “applications that are pieced together.” Its characteristics: the apps will be relatively small, the data will be in the cloud, the applications will run on any device, and the apps will be very fast, very customizable, and distributed virally (social networks, email, etc.).

In short, the CEO of a Web 2.0 giant re-iterated O’Reilly’s definition of Web 2.0, characteristic for characteristic:

“applications that are pieced together” -> Characteristic 5
“relatively small apps, whose data is in the cloud” -> Characteristics 3 and 5
“the application can run on any device” -> Characteristic 6
“the apps will be very fast, very customizable and distributed virally” -> Characteristics 1, 2, and 4

If Schmidt is correct, the utility of web services drives their profitability, meaning Web 2.0 is still profitable so long as companies make add-ons that can be bolted onto existing services to provide novel functionality like location-based services or financial payment systems. Perhaps Schmidt was motivated, like O’Reilly before him, to establish new jargon, distance his company from Web 2.0, and re-brand technology that Google is invested in.

Nova Spivack, CEO of Radar Networks and creator of Twine, the first mainstream semantic web-enabled service, begs to differ. According to him, Web 3.0 is statistics, linguistics, open data, computer intelligence, wisdom of crowds, and user generated content, all coming together. It is the “natural convergence” of web innovations that have occurred to date. And yes, semantic web technologies will power the convergence. When Spivack released Twine two years ago, he explained how the newly popular buzzwords “semantic web” fit into Web 3.0. Unlike the web’s iterations, the semantic web is not jargon.
It is a set of standardized technologies designed to address features currently lacking in the web. It is meant to extend the web so that all information exists in a format that software can understand and reason with. In so doing, software can leverage conceptualized information (i.e. knowledge), making it seem more intelligent. An early realization involves web services driving more complete responses to natural language queries. Another is thinner applications that leverage semantic-aware services or structured information in a “knowledge commons.” As such, web services will no longer be restricted to the information they themselves capture or store in traditional relational databases.

At the moment, Spivack envisions the semantic web simply merging additional metadata, in the form of RDF (Resource Description Framework), into existing content. More complex ontologies using OWL (Web Ontology Language) can come later. Meanwhile, like another semantic web pioneering firm called MetaWeb, he recommends navigating the noise about Web 3.0 by recognizing that the semantic web presently resides in a niche, but will eventually affect all aspects of the web. Spivack also recommends looking deeply into purported Web 3.0 companies that market semantic products, to determine what technologies they use and how much expertise they have.

It is now May 2009. The capabilities that I forecast in my original tech scan have neither been fully realized, nor have many of these technologies been popularized. While some smart phones have global positioning system (GPS) technology built in, very few mainstream services are location aware. Notebooks have superseded desktops, but netbooks have emerged, meaning that furniture with built-in consoles is still a while off (Microsoft’s Surface notwithstanding). While some credit cards use RFID chips to automate payment at the cash register, product-based asset management has not been widely adopted.
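To make the idea of merging RDF metadata into existing content concrete, here is a minimal sketch of what that looks like in practice, using RDFa attributes layered onto ordinary XHTML. This example is illustrative only – the title, author, and date are made up, not taken from any real page – but the Dublin Core vocabulary and the RDFa `property` attribute are real, standardized pieces of the semantic web stack:

```html
<!-- Ordinary blog markup, annotated with RDFa attributes.
     Dublin Core (dc:) is a real, widely used metadata vocabulary;
     the specific title, author, and date shown are hypothetical. -->
<div xmlns:dc="http://purl.org/dc/elements/1.1/">
  <h2 property="dc:title">Web 3.0: Web 2.0 With Pants</h2>
  <p>Written by <span property="dc:creator">Don</span>
     on <span property="dc:date">2009-05-01</span>.</p>
</div>
```

The page looks exactly the same in a browser, but a semantic-aware crawler can now extract the title, creator, and date as RDF triples. Richer OWL ontologies can be layered on the same content in the same way later, which is precisely the incremental path Spivack describes.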
While Apple released a 17″ desktop-replacement notebook with a reputed day’s worth of lithium-ion power, long-lasting fuel cells do not power our gadgets. As for web services, the promise of service oriented architecture was powerful, but its popularity has fallen by the wayside with the emergence of cloud computing. What we can draw from this in the midst of the web potentially going through yet another redacted effort to
Web 3.0: Web 2.0 With Pants – Part I
Over the next two days, I’m handing Suzemuse over to my friend Don. He has a really compelling take on the phenomenon of Web 3.0 and where this new media stuff is all headed. He starts us off today with a history lesson on how we got from Web 1.0 to where we are now. Enjoy the post, learn lots, and I encourage you to open a dialogue in the comments!

In 2003, I was asked to write a piece, called a “tech scan”, on what impact then-current advances in technology would have on the justice sector. After spending several days reading ZDNet, CNET, and other similarly reputable on-line sources, I settled on definitions for bleeding and leading edge technologies, linking them to Gartner’s “hype cycle.” I then chose specific technologies to focus on. Gartner is a multi-national consulting firm whose highly paid consultants monitor technology and produce highly regarded projections for CEOs (Chief Executive Officers), CIOs (Chief Information Officers), and CTOs (Chief Technology Officers) to follow. My choices – pervasive computing (aka ubiquitous computing), radio frequency identification (RFID), identification frameworks, fuel cells, and web services (system-to-system communications) – agreed with Gartner’s predictions. I even put together a scenario for how these technologies could work together.

When completed, I concluded that determining technology trends and forecasting their effects the way I did was tantamount to standing against the tide. Sifting through mountains of disparate information on technologies to determine what promise they had was both labour and time intensive. At the end of the piece, I strongly recommended my employer dedicate resources to regularly monitor emerging technologies and leverage existing trend forecasts.

It is far easier to put together a tech scan on the impact of advances in web technologies, as the web has three iterations already defined. They span past, present, and future.
Web 1.0 begot what is presently Web 2.0, and Web 3.0 is emerging. Each iteration is not a version per se; the World Wide Web (www) has not been upgraded. To explain, let us start, somewhat counter-intuitively, with Web 2.0, because it is the only iteration with an accepted formal definition, coming from Tim O’Reilly. O’Reilly, however, should not be confused with the English computer scientist Tim Berners-Lee, who is largely credited with inventing the web in 1989.

During the years leading up to March 10, 2000, a rash of Internet-based companies, commonly referred to as “dot-coms”, were founded. They saw their stock prices multiply by simply associating themselves with the growing fervour around the web. Investors focused on increasing their market share without any concern for profitability. Many dot-coms had neither discernible products nor services. Few had articulated business plans. Their goal: establish a recognized brand, claim market share even at a sustained net loss, and charge profitable rates for services afterward. Then the bubble burst, which led to the emergence of Web 2.0.

Pre-bubble, building web applications or designing websites was incomprehensible to the general public. Afterward, web application developers and web designers were treated with disdain. Those of us who left university with formal training in building software faced some hardship starting our careers post-bubble. However, the web continued to evolve. While the more resilient dot-coms struggled to re-invent themselves (e.g. Amazon and Yahoo), traditional publishers discovered a new digital medium in the web and found their way onto it, only to discover that non-traditional content was already present and thriving: the blogosphere. People had flocked to the web, using it to create new ways to communicate with one another and share ideas. The wasteland that was the web had again grown lush with opportunity.
In 2004, O’Reilly, CEO of O’Reilly Media, held a brainstorming session with a trade show production company called MediaLive International. Its intention was to address the “state” of the web, determine its possible future, and demonstrate that it could again generate revenue. During the session, O’Reilly Media publisher Dale Dougherty coined the phrase Web 2.0. A year later, O’Reilly published a blog post formally defining Web 2.0. According to it, Web 2.0 has the following characteristics:

1. The Web as a Platform: services are built and delivered from the web (e.g. Google)
2. Harnessing Collective Intelligence: the web both empowers user generated content and leverages that content to foster communities and harness collaboration (e.g. blogs, wikis, and RSS feeds from them)
3. Data is the Next “Intel Inside”: data drives the services on the web, but successful services depend on embracing and enriching it with contributions from many sources (e.g. Google Maps)
4. End of the Software Release Cycle: software is delivered as a service
5. Lightweight Programming Models and Mash-ups: simpler solutions are built by leaner development teams who can innovate by assembling existing solutions
6. Software Above the Level of a Single Device: services are no longer limited to desktops or laptops and can leverage other devices like consumer electronics
7. Rich User Experience: user interfaces have more dynamic interactions, but are based on open standards (e.g. CSS for layout, XML for data, XHTML for markup, Javascript and DOM for interaction)

Essentially, he defined a piece of jargon that encompassed all of the new business models and service approaches that formed post-bubble, labelled everything that came before as Web 1.0, and dismissed Web 1.0 as antiquated. Based on this redefinition, the 1.0 incarnation of the Web began in 1990 and ended in 2000. Its 2.0 incarnation began in 2001 and continues today.
In 2006, Berners-Lee took issue with Web 2.0 in a podcast, saying that, as jargon, it overlooks the fact that Web 1.0 was equally about connecting people. The jargon is also open to interpretation. If Web 2.0 for a particular person is about blogs and wikis, then it is about people – and that is precisely what the Web was initially designed for: a collaborative space where people can interact. In other words, people had been putting word to web long before the emergence of the chronological blog or the permalink. The wrinkle is the ease with which to publish online content and increased
$250 is what I want…
I’ve got Danny, John and Matt too. Be warned… this will end well. Love, Jack. xoxoxo
My Name is Jack
The Secret of Successful Communities
This morning I sat down over tea and cookies with two of the most interesting, creative and smart people I know. We’ve all been friends for years and years – since I was just a wide-eyed kid embarking on my first real TV job. My friend’s basement office was like some sort of secret lair, a place where you just know big plans are being made. We huddled over our shortbread. A large painting of our host was oddly juxtaposed behind him. The striking likeness watched over our conversation with interest as the real man sat in front. It was at once surreal and familiar. We sat, sipping, and mulled over our own big plans. The whole experience set me on the following stream of thoughts for the rest of the day.

One of the reasons why I am so interested in and engaged with social media is that it revolves around the central theme of community. Whether you are building social media strategies for multimillion dollar companies, or creating a personal blog about your passion, you are striving to achieve the same thing – to build a strong, thriving community around something people care about.

It baffles me that so many people look at community as some kind of bold new concept, especially in conversations around social media. They talk about community building, community management, community awareness… as if it’s the next big thing, the next new wave. They totally overcomplicate the process, and the entire point of community gets lost in translation. I’m as guilty as the next person of getting caught up in the fad of community – especially as it relates to the online world.

Community isn’t new. Community has been around as long as we have. Communities are not created by tools or technology… they are created by people. My two friends helped me remember this today. One of my friends is working on building a new community. My other friend (the guy with the secret lair) has been building communities for years. Both of them are really, really good at it. They’ve got it cased.
Not only do they know how to build a community, they know how to make it strong. It doesn’t matter what kinds of communities they are building. What matters is that they aren’t getting caught up in the process or the buzzwords or the tools. They are just doing it. Want their secret? Well, okay then.

Surround yourself with smart, passionate people. Then you never have to worry that there won’t be anyone around to help.

Just because someone has different tastes doesn’t mean they don’t belong in your community. Give everyone a chance to try it out. A strong community is a diverse community.

Strong communities have great leaders. Great leaders have strong communities.

A community doesn’t take work. It takes passionate people who care about making it work.

Sometimes I think we get so caught up in the technology and the next big thing that we forget that really it comes down to people. It’s not hard to build a community. What’s hard is putting aside the gizmos for a bit and focusing on what kind of community you want to build. And then just getting out there and building the darn thing. Thanks, two friends!
Music and the Geek Set
On Friday night, a bunch of local Ottawa bloggers got the chance to really get their geek on, thanks to Jen Covert and the National Arts Centre, as the Sci-Fi Spectacular hit the stage. Not only were we treated to a great performance by the National Arts Centre Orchestra, we were part of a select group chosen to attend a backstage meet and greet with none other than George Takei, a.k.a. Mr. Sulu from the original Star Trek series.

My brother, Mike Thompson, is just about the biggest sci-fi fan you’ll ever meet. Well, until you meet my friend Mark Bell. I have to confess that I am not well versed in the sci-fi genre much outside of Star Wars and perhaps Star Trek: TNG, but Mark and Mike really were transported to another world by this performance. I brought along our new Flip MinoHD camera to capture some of the goings-on of the evening, and my brother edited it together into a great little piece that I think sums up our evening really well. He’s even inserted our tweets from the night throughout the piece.

For anyone who thinks the National Arts Centre is a stuffy old place where only the hoity-toity go, think again. They are doing some really progressive stuff these days, with shows like this and all of their great new media ventures. What I really appreciate is that the NAC is reaching out to us bloggers to help them tell their story. And the consensus is, we’re happy to report, that the NAC really is the place to be for arts, culture, and just plain fun in the Nation’s Capital.

So without further ado… ladies and gentlemen… The Sci-Fi Spectacular.

Update – here are some more Sci-Fi Spectacular photos, courtesy of my pal @Elkae.
Want Real Success? Then Get to Real Work.
I’ve been observing lately what it is that makes someone successful. Success for many people means money, and sure, that’s part of it (who doesn’t love money?). But I mean truly successful – fulfilled in work and life, unbelievably productive, surrounded by amazing people and love and positive things, and yes – financially stable. I look around me and it’s pretty easy to spot the people who are truly successful. You can look around you and do the same. I can also see some who are faltering. Look around – yep… there. And over there.

This post is not intended to “call people out” or be self-righteous in any sense. I’ve been trying to determine for myself how to gauge success in my own work. What I’m sharing here are some thoughts that I’m using to guide that.

I want you to look at the lists below and put yourself into them. Go down each list, and put a little mental asterisk beside each thing you are doing on a day-to-day basis. It’s the day-to-day basis that’s critical here. Because what I’ve learned is, as much as it’s important to think long-term and big picture, it’s the actions you take every single day that will move you closer to your goals.

Real Work

Building relationships
Planning. Creating. Delivering. (These three go together. Always.)
Delegating
Building trust
Maintaining trust
Thinking about what’s possible
Negotiating contracts

Not Real Work

Spending more time getting others real work than getting yourself real work (i.e. being too helpful)
Going to every social or networking event in town or all over the country just because everyone else is
Talking incessantly about the tools
Talking about how busy you are
Doing “busy” work just so you can be busy
Thinking about what’s NOT possible

How does your list balance out? Remember, think in the context of the activities you are doing every day. Do they constitute “Real Work”, or “Not Real Work”?
If your balance is more in the “Not” category, then I suspect you may be wondering why you are not getting any closer to those goals you set back in January.

Now, you might be thinking… sure, this is all fine and well in YOUR industry, Suze. This makes sense because you are an entrepreneur, a media producer, whatever. This is what YOU need to do to gauge YOUR productivity, YOUR success.

Try something. Pretend for a minute that you want to be the most successful bartender, professional hockey player, or cab driver ever. Go down the list, as one of those people, and put it in the context of what that person is doing on a daily basis. If that list was unbalanced, with most of the stuff in the “Not Real Work” category, how successful do you think they would be?

It’s not about how full your to-do list is. It’s not about the superstars you hang around with. It’s not about the tools. It’s not about the tools. It’s not about the tools. It’s not about being busy for the sake of “busy-ness”. It’s not about the reasons you can’t do it.

What’s holding you back from real success?
2 Myths About Mainstream Media
Wow – people are really all a-buzz about mainstream media’s latest foray into the world of social media, eh? I’m seeing lots of different viewpoints and some great conversations. In fact, I’d say that social media is really showing its stuff right now – the community is out in full force, in one way or another, trying to figure out what it all means.

I have been a media producer, in some form or another, for going on 20 years. In my time, I’ve produced media (television, radio, print, advertising, marketing campaigns) for everything from high profile, mainstream media outlets to large corporations to mom and pop shops and local community cable. I feel fortunate that I’ve been able to witness first hand what is likely the single largest transformation in the way people communicate since the invention of the telephone.

With that said, I wanted to present my take on recent events within mainstream media and some of the social media community’s reaction to these events. I present here the dispelling of a couple of myths about mainstream media, based on my observations over the past several years. Please keep in mind that this is my opinion only. The purpose here is to state my position on the matter and open up honest discussion about it. I invite you to dispel what I’m dispelling.

Myth #1: Mainstream Media Doesn’t “Get It”.

I’m seeing a lot of defensive behaviour coming out of the social media crowd in the past few days. They are right on top of celebrities like Oprah and Kutcher, accusing them of ruining Twitter, using it as a broadcast medium only, and – the one that really gets me – not understanding the “nuances” of social media. It’s true that there are things people can learn about effectively using social media as a tool, and there are plenty of good, interesting, smart professionals out there helping people figure it out.
But anyone who thinks that a multimillion dollar corporation like Harpo has not done its homework before diving into social media with both feet is coming at it from the wrong perspective. The beef incident from a few years back, if nothing else, should prove that Oprah’s people have most definitely learned to do their homework. As for Kutcher and the rest of them, they’re all businesspeople too, with images to uphold, and a flurry of agents and publicists ready to do damage control on their careers if they say or do something stupid on the public stage. Mainstream media doesn’t get it? I beg to differ, in a big way. In fact, they may get it a lot more than many of us. We’ve been stuck for a long time in the same rut with social media, talking about the same stuff over and over. They are, in part, starting to move the medium forward.

Myth #2: Mainstream Media is Missing the Boat.

Ooh! I love boat analogies. And my husband came up with a doozy last night. He said, “Big media is an aircraft carrier. It takes a long time to turn a ship like that around.” Let’s look at mainstream media in relation to the average social media superstar. Oprah has hundreds of millions of viewers for her TV show. According to Wikipedia, she gets 70 million page views per month on her web site. Social media superstars, even the really popular ones, are not working anywhere near that scale. Most are in the tens of thousands, and a few elite are in the hundreds of thousands.

What does that mean? Oprah’s driving an aircraft carrier. So is CNN. Your average social media superstar is driving a speedboat. He can turn on a dime. I’m not saying he’s not putting a great deal of thought into strategy and planning and image and all the rest. Of course he is. But big media is strategizing and planning and considering image too. The difference is, they are doing it on a much larger scale. They aren’t missing the boat. They are just driving a way bigger one than the rest of us.
And once they get it turned around – and it’s about 3/4 of the way there in some cases – the game is going to change for good. So I guess what this all means is: are you ready for things to start changing? Instead of defending the models that have been created over the past few years, how are you going to adapt your model to the changing tides? Or maybe you don’t think big media changes things at all. Maybe they’ve just shown up at the party and are going to mix with the crowd. What’s your take?
Why It's All About Stories
I met someone for the first time recently. They asked me, “What do you do?” The story of what I “do” has changed over the years. I used to just say, “I work for [insert company name here].” Then after a while, I started to identify more with the actual work I was doing. “I’m a TV producer.” “I’m a web designer.” “I’m a technical writer.” “I’m a professor.”

When we started the company, my description became more complex. “I’m a partner in a production company.” Which always prompted the response, “Oh? What kind of production?”, to which I’d have to give a long-winded explanation that, well, we do TV production, but also web video, corporate video, and – oh yah – we develop web sites too, and do marketing… The problem I was having is that I could no longer easily sum up what I actually “do”. I was “doing” too many different things.

Then the other day, it hit me. What I do is irrelevant. It’s a task list. It’s what I am that matters. So what AM I? Simple. I’m a storyteller.

All day, every day, I’m telling stories. I’m telling them here, on my blog. I’m telling them in 140-character increments, on Twitter. I’m telling them when I meet my friends for drinks or coffee or dinner. And I’m helping my clients figure out how to tell their stories too, whether it’s through a new web site, a video, a TV show, a marketing campaign, or all of the above. It’s all storytelling.

So how does this tie in to the notion that it’s not about the tools? Well, the fact that it’s really all about storytelling rather proves it, I think. If what we’re all doing out here is telling stories (our own, our clients’, others’ – it doesn’t matter), then really, the tools become irrelevant. Sure, it’s important to know how to use the right tools, and to use them the right way, to ensure the story is told well. But the tools themselves are secondary. It’s the story that really matters. When all’s said and done, it’s the story that people will remember – not what you used to tell it.
I’m wondering if it’s time to reconsider how much effort we are spending talking about the tools themselves, and if perhaps we should be spending more effort figuring out what our story is and how we’re going to tell it. Thoughts?