January 22nd 2016
The good folks over at Real Estate Tech News gave us a great review this morning.
Check it out here! If you’re interested in learning more about our IDX and app services for real estate agents, feel free to chat with one of us in the talk window at the bottom of the page.
December 10th 2015
Inman wrote a great piece about Wovax this morning. Check it out here: http://www.inman.com/2015/12/10/wovax-is-re-inventing-the-way-agents-use-their-idx-feeds/
“This is really unique software…Wovax is offering agents an easy, affordable way to capture mobile-savvy home shoppers. When coupled with a listing-rich website, it becomes an effective end-to-end lead generation strategy.” – Craig Rowe, Tech Columnist at Inman
And while you’re at it, enjoy a discount on your setup here at Wovax with the code december2015.
May 13th 2015
When you think about Disney, you probably think Mickey Mouse and fairy dust. But Disney has been doing pretty well for themselves lately, and that’s partially because they’ve been reinventing themselves not only as a brand, but as a business. A very tech-savvy business.
Disney’s current technology kick can be traced back to the 1980s, when computer-generated imagery (CGI) was just taking off. One of the biggest supporters of CG in filmmaking was George Lucas, the creator of Star Wars. His company Lucasfilm had a small computer division called the Graphics Group (later Pixar) that mostly justified its existence by selling high-end graphics software and computers to medical imaging companies. Lucas was supportive of the team and could see the future of CGI, but in 1983 he and his wife Marcia went through a particularly disastrous divorce. The same year saw the original Star Wars trilogy come to a close (and with it, the merchandising sales drying up). Lucas no longer had the financial stability to support Pixar, so he began looking for a buyer – specifically, someone who could afford to let the team keep working away until good things came about.
Pixar’s savior came in the form of Steve Jobs, who had just gone through a divorce of the corporate kind that saw him ousted from the company he created, Apple. Jobs had money to invest and saw a kindred spirit in John Lasseter, the leader of the Pixar crew. For almost ten years, Jobs funded the Pixar team as they honed their technology on corporate commercials and short films. He was surprisingly hands-off, realizing that animation wasn’t his forte; he knew the Pixar team was passionate and didn’t need his infamous micromanagement. Jobs was always willing to back Pixar with his vicious boardroom manner whenever Disney, which was producing Pixar’s first film Toy Story, wanted to pull the plug on the seemingly doomed project. Pixar prevailed, and Toy Story became the first fully computer-animated feature film in cinema history, cementing a multi-film distribution deal with Disney.
Jobs also found something of a Bill Gates equivalent in hotheaded Disney studio chairman Jeffrey Katzenberg. Katzenberg is often credited with leading the studio out of its ’80s slump into what has been dubbed the “Disney Renaissance” (he was responsible for hits such as Beauty and the Beast and The Lion King). Katzenberg left Disney in a rage just when Pixar was getting its footing; he was passed over for a top position after Disney’s president Frank Wells was killed in a helicopter crash. Shortly thereafter, Katzenberg rushed his newly minted DreamWorks (co-founded alongside Steven Spielberg and record mogul David Geffen) to its first CG film. That project was Antz, a none-too-subtle effort to undermine A Bug’s Life, which Disney and Pixar were developing when he made his loud exit.
Throughout the next decade, Pixar grew unhappy with Disney. Though Katzenberg had often been frustrating to deal with, there was no denying that he was an animation guru. Under CEO Michael Eisner, Disney not only failed to replicate the string of hits that Katzenberg had produced, it also began to focus heavily on low-rent sequels to hits from decades prior. Eisner became so unpopular that Roy Disney (Walt’s nephew) began a shareholder revolt to have him ousted from the Mouse House. Pixar and Jobs were also frustrated with the new Disney and made it known that they would seek out a new distributor for their productions once their contract was up.
This was when Bob Iger showed up. Iger, who had been the president and COO at Disney for a few years at this point, was a problem-fixer who had great aspirations. Iger and Jobs became such fast friends that when he eventually sold Pixar to Disney, Jobs gave Iger a chance to back out of the $7.4 billion deal by confiding something that not even his family knew of – his cancer had returned and he only had five years left.
Under Iger’s drive, Disney acquired Marvel and Lucasfilm (for a cool $4 billion each) which gave him access to some of pop culture’s most iconic characters. The acquisition-based business model of the tech world that Jobs hailed from was undoubtedly rubbing off on him. (The two would have meetings to talk about who they could buy and revamp into something greater. Yahoo was in their sights at one point).
Now Iger is thinking a little smaller, but with just as much enthusiasm and potential for growth. Last year saw Disney jump into the startup incubator pool with the Accelerator Program. Each year, ten startups from around the world are selected to spend three months working with entrepreneurs and tech-minded Disney executives to bring their ideas to full potential. The ideas Disney has shown interest in are those driven by technology or software with interactive and entertainment capabilities. The companies still own their intellectual property when they leave the accelerator after three months and are free to continue on their own, usually wiser and richer for the experience and spotlight. Disney seems to see this as an opportunity to get the first crack at new and emerging technologies before competitors, as well as a way to encourage growth beyond the typical startup avenues. There’s very little fine print.
The Walt Disney Company seems to be taking a leaf out of Apple’s book. Like Apple, they have had a very successful run after a period of less than desirable results, and a good portion of that has been in smart buying (they also own ESPN and ABC in addition to Pixar, Marvel, and Lucasfilm). Since the earliest days when Walt himself was crafting the first animated feature with color and sound, the Disney mission has remained the same: explore the bleeding edge of new technology with the ultimate purpose of creating compelling characters and telling a satisfying story.
April 15th 2015
As devices become smaller and more powerful, one of the things taking a hit is battery life. Apple, for example, has enjoyed success with its latest iPhone models and invited scrutiny with its Watch. What we have is great, but it’s being pushed about as far as it can go as electronics become thirstier than ever. Even the new MacBook had to sacrifice computing power to fit what ended up being a multi-compartment battery, rendering it even less powerful than a MacBook Air. The quest for thinner is now at the impasse of choosing between sleek looks and computing muscle.
Some of the most interesting battery innovation right now isn’t coming from trying to squish something into a smartwatch. It’s in the electric car industry. Tesla, the company that made electric cars both practical and cool, is working to bring an enhanced model of its car batteries into homes, and will probably announce them at the end of the month. Combining these batteries with solar power will be a boon to anyone trying to cut their electric bill. That’s pretty cool, but how does it help your smartphone battery?
Let’s take a step back for a second and ask ourselves: what if we applied the same thinking to power sources that we do to other areas of computing? Take cloud technology, for instance. At this point, it’s ubiquitous. Wireless networks are omnipresent and it’s rarely inconvenient to rely on them. Computers are increasingly just windows to the troves of information that we store elsewhere.
Now let’s take these building-powering batteries a step further with wireless charging. No more dangerous sockets for kids to zap themselves with. Whether you’re at the office or a coffee shop, your seating options won’t be limited by a dying laptop battery. You’ll even notice a difference with the extra freedom you’ll gain in arranging your furniture where you want. Oh, and guess what? Wireless charging is totally real. Starbucks has been testing wireless charging mats in their stores, IKEA is working it into their products, and numerous products such as the Vessyl come with wireless pads that can charge the device when placed on top.
So, yeah. Wireless charging! How does this affect your device’s battery life? The most likely way seems to be similar to how your smartphone’s internet access works. To get faster speeds that don’t munch into your data plan (and to save precious battery life) you probably use a wi-fi connection whenever one is available. Wireless charging would work the same way. Your devices, as long as they are in range of a battery that is broadcasting power, will be connected and charging. Once they are out of range, they revert to the onboard battery. And wireless charging is safe; no more dangerous than a cell phone.
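That wi-fi-style handoff can be sketched in a few lines of code. To be clear, this is a purely hypothetical model, not any real charging API – the class and method names are invented to illustrate the in-range/out-of-range fallback described above.

```python
# Hypothetical sketch: a device draws from an ambient power field when
# one is in range, and reverts to its onboard battery otherwise.
# All names here are illustrative, not a real charging API.

class Device:
    def __init__(self):
        self.onboard_battery_pct = 80
        self.power_source = "battery"

    def update_power_source(self, fields_in_range):
        """Prefer a broadcast power field, like preferring wi-fi over cellular."""
        if fields_in_range:
            self.power_source = "wireless"   # charge from the broadcast field
            self.onboard_battery_pct = min(100, self.onboard_battery_pct + 1)
        else:
            self.power_source = "battery"    # out of range: drain the onboard battery
            self.onboard_battery_pct = max(0, self.onboard_battery_pct - 1)
        return self.power_source

phone = Device()
print(phone.update_power_source(["coffee_shop_pad"]))  # wireless
print(phone.update_power_source([]))                   # battery
```

The point of the sketch is the seamlessness: the device decides per tick, so you never think about plugging in any more than you think about joining wi-fi.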
Wireless charging is likely to first take off with electric cars as well. Tesla knows that plugging in a car is a monotonous task that will become easier and easier to forget. Think of how many times you forgot to charge your phone before bed. Now imagine that you ride your phone to work. Elon Musk, Tesla’s CEO (and 007 villain in the making) is experimenting with robotic snake plugs that attach themselves to your car, but you can bet they’ll go wireless as soon as they can.
March 4th 2015
Kickstarter has produced some wacky projects that have really taken off. Everything from card games about exploding kittens to some dude who got a serious case of the munchies has been successfully funded. It’s become its own Shark Tank reality show in a way; for every good idea, there’s a ton of stuff that would make even the most jaded SkyMall customer blush.
In 2012, startup Pebble broke the Kickstarter monetary record for its original run of smartwatches with over $10 million in pledges from over 68,000 people. (Their target was $100,000). Pebble became a hit and has now shipped over 1 million watches. Now Pebble is back on Kickstarter with an updated version of the watch called Pebble Time. In just a week they have rocketed past their goal of $500,000 with over $15 million, once again responsible for the most funded project on Kickstarter. I haven’t had a chance to play with one in person, but the Pebble does seem to be a pretty cool little gizmo, especially for the price point.
The Pebble is definitely a grassroots hit, and with an almost too-simple look, complete with e-ink display, it’s arguably more minimalistic and devoid of bells and whistles than the forthcoming Apple Watch from the company that most prides itself on sleekness. The Pebble Time is already being called the Apple Watch killer, despite the fact that neither device has been released yet. A huge deal for both is battery life. The Pebble Time boasts a rather impressive 7 days between charges, while the Apple Watch will have to be recharged every night, assuming it makes it through the entire day (because – let’s be honest here – people are going to try and watch YouTube videos on it and be disappointed when that doesn’t work out so well).
I’m willing to give both a chance. At this point, they both suffer from the biggest con of all – they’re both smartwatches. I’ve seen maybe two smartwatches in person, ever, and at least one was just a cheap LG or something. And while the Pebble doesn’t give me the same skin-crawling “calculator watch” feel that pretty much every other smartwatch does, it still looks like something you might get in a Happy Meal. The Apple Watch, on the other end of the design spectrum, looks nice but still feels a little too flashy. Apple is known for making products that declare themselves to the room in that smug, “too cool for school” Jony Ive way, but until now none of them were strapped to your wrist. It would almost be worth waiting a year to get one just so that every time I try to make a point in a conversation my watch doesn’t become the topic. (I’m not Italian, but when I’m deep into an animated talk, my hands definitely are.)
People like everything divided up into neat categories of black and white, right or wrong. In the realm of mindless fun, like sports, it makes sense. X sports team is obviously better than Y sports team because they’re my favorite! When you let this mindset loose in actual real life (like, say, making politics exclusively Democrat vs. Republican), it frankly makes you look like an idiot. I’m a fan of Apple products, so when a dyed-in-the-wool Windows or Linux user tries to bait me into a fight about “which team is better,” my response is kind of a shrug of “everyone needs a different product for different things.” This was especially confusing to people who would try and pick computer fights when I worked at an Apple store.
People are obsessed with the idea of “better,” and most of the time it comes from an insecurity that they’ve made the wrong choice somehow. They need validation that where they’ve put their money is indeed The Best. But here’s the thing. It doesn’t actually matter. If you like it, and it does what you need it to do, then for you it is The Best. Who gives a damn what everyone else thinks? I have better things to do than argue with someone that Apple makes better computers than everyone else. For me and my needs, that is true. For others, that may not be. Same goes for Android vs. iPhone, Pebble vs. Apple Watch, and so on.
Pebble is probably not going to dip into the Apple Watch sales any more than the Apple Watch will dip into the Pebble’s sales. As far as I can tell, they’re angling for different kinds of consumers. And that’s great. A healthy marketplace will always have room for rivals. Not only does it give consumers more options, but it keeps each company on their toes as far as constantly innovating. Comparing them at a certain point is all about personal preferences and it gets pretty nitpicky. Don’t get too worked up about what the other guy is using. Just choose what works for you and have fun with it.
February 25th 2015
Every generation has that guy standing on a street corner with a big ratty DOOMSDAY sign. Sometimes, like in the recent Cold War era, everyone else joins him. For a while, terrorism was Fear’s employee of the month, but that hot streak has cooled by now. It’s still around, but its desk got stuck in the basement with Milton from Office Space. There’s a lot of fringe-sounding ideas getting talked about in mainstream media these days since big tech moguls (which tend to be a pretty eccentric bunch) are getting a lot of face time and becoming household names. One of those fringe ideas is artificial intelligence.
A few well-known moguls have expressed rational levels of concern without being too much of a buzzkill about it: Bill Gates and Elon Musk are both interested in AI development, but only if it is done with great thought. Stephen Hawking is less thrilled by the idea and has said that AI could spell the end of the human race.
AI’s most tangible threat is currently unfolding in low-level jobs where it is already being implemented to replace human employees.
While jobs that require a very specific and trained skill set are often aided by AI, they still employ highly trained individuals. Less glamorous jobs, such as those in factories and manufacturing, have already started to replace humans with automated robots. More and more, any job that doesn’t allow for thinking on the fly or outside the box – work that is repetitive and robotic by nature – is going to be targeted. The average, everyday job seems safe for now, but according to Bill Gates, as artificial intelligence develops it will start to edge out middle-class employees as well.
But after that warning, Gates hypothesizes a refreshingly optimistic solution. He says that while concerns about the replacement of people by machines are valid, this is a chance to improve education and help people survive displacement by robotic workers. After the recent financial crisis, economic downturn and job loss are more relatable and worrisome threats to the current working generations than the vague, always-impending physical destruction of the world that was prevalent in the Cold War (and which is usually the bread and butter of AI doomsayers). If AI does pose any threat in the immediate future, it is in socioeconomic areas like this.
Foxconn is a massive Chinese electronics manufacturing corporation notorious for the slavish treatment of its 1.2 million workers. Conditions are harsh and hours are inhumanely long, forcing many employees to live in crowded factory dorms just so they have time to sleep, much less afford a place of their own. In 2010, 14 workers committed suicide by jumping off factory rooftops; Foxconn’s response was to install suicide nets. Many of the companies using Foxconn’s services have turned up the heat regarding the treatment of workers, including Apple (which is currently bringing manufacturing back to the United States – it opened a facility in Texas in 2013 and another is planned in Arizona).
In 2012, Foxconn invested in 10,000 robotic arms to replace workers at $25,000 a pop, with 20,000 more on the way. The plan has already started to backfire, as the robotic arms are not precise enough for many of Foxconn’s products, forcing the company to hire back as many as 100,000 workers.
Disposable treatment like this says a lot about the business environments that are driving a demand for workers to be little more than robots. Companies like Foxconn are just as ruthlessly efficient and demeaning to humans as any potential human-hating AI singularity would be. Bill Gates is right. If AI software is advancing fast enough to shuffle people around like a pile of computers, then the best way to prevent any AI-related troubles in the job field is in creating better opportunities, training, and education. AI is just a complex tool. And for the foreseeable future, the most dangerous thing will be the people wielding it. And if those people continue to treat other people in this way, any future AI they create will be just the same.
January 28th 2015
One of the most talked-about aspects of last fall’s new iPhone lineup was Apple Pay. Using contactless NFC (near field communication) technology built into your phone, it lets you make payments without a physical credit card. And by using your fingerprint instead of an easily stolen PIN, verification is more secure. True to the best technology’s habit of streamlining simple tasks and decluttering our lives, Apple Pay was a hit.
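To see why that combination matters, here’s a toy sketch of a fingerprint-gated, tokenized payment. This is emphatically not Apple’s actual implementation – the key handling, function names, and the string stand-in for biometric data are all invented for illustration (the real system uses a dedicated secure element and per-transaction cryptograms) – but it captures the two ideas: nothing happens without a live biometric match, and the merchant only ever sees a one-time token, never the card number.

```python
# Illustrative sketch only: fingerprint check in place of a PIN,
# plus a one-time token in place of the raw card number.

import hmac
import hashlib
import os

DEVICE_KEY = os.urandom(32)          # secret that never leaves the device
ENROLLED_FINGERPRINT = "user-print"  # stand-in for real biometric data

def authorize_payment(fingerprint, amount_cents):
    # Step 1: biometric check replaces an easily stolen PIN.
    if fingerprint != ENROLLED_FINGERPRINT:
        return None
    # Step 2: emit a one-time cryptogram instead of the card number,
    # so a merchant data breach never exposes the underlying card.
    nonce = os.urandom(16)
    return hmac.new(DEVICE_KEY, nonce + str(amount_cents).encode(),
                    hashlib.sha256).hexdigest()

print(authorize_payment("user-print", 499) is not None)  # True: approved
print(authorize_payment("stolen-print", 499))            # None: rejected
```

Because every authorization mixes in a fresh random nonce, even two identical purchases produce different tokens, so intercepting one is useless for replaying it later.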
Much of the appeal for the upcoming Apple Watch comes from it featuring NFC as well, allowing you to pay with a swipe of the wrist.
Critics were quick to point out that Apple didn’t invent NFC technology. In fact, it has been present in phones for quite a while: Nokia started introducing NFC in 2006, and Samsung has been using it in its phones for several years now.
Words like “steal” get tossed around quite a bit in tech these days. Sure, tech companies are always at each other’s throats in bitter court battles, but most of that is a formality. The heart of a thriving market has always been competition, and that only comes about when companies try to outdo each other. For the most part, this works out well for everyone involved. In trying to beat a competitor, a company will push itself beyond what it thought was possible and ideally create the best product it can. To that end, customers end up with a selection of top-notch products and services to choose from.
Apple didn’t invent the computer. Instead, they made it more accessible to the general public. They didn’t invent the MP3 player, but instead used the classic “razor and blades” approach with the iPod and iTunes store respectively. Apple didn’t invent the smartphone, but they once again made it accessible and approachable to the average consumer. These innovations were so successful that in 2007 the company changed its name from Apple Computer, Inc. to a simpler Apple, Inc., thus signifying a shift to consumer electronics in a broader sense.
This isn’t a “let us now bow down and worship Steve Jobs” article. But as it stands, Apple is currently the world’s most valuable brand. Even if you hate them, you have to admit that they are clearly doing something right. So what is it?
The two things that Apple has gotten ridiculously right are branding and environment. I’ve talked before about the anecdotes I collected working the sales counter at an Apple retailer, and I’m still amused by the number of people who would complain to me about how much they hated Apple because of how expensive it is, or because Steve Jobs was a jerk, or because Apple is about to start losing money, etc. Joke’s on them, though, because they still came in to buy a $2,000 Facebook machine. Oh, and Apple just had the most profitable quarter of any company in history – as in, they blew past the previous record quarter, which was held by a natural gas company. Right after “everyone” said their gargantuan iPhones would fail. But negative critics are often the loudest. Most people are happy with the products Apple offers, and those kinds of numbers don’t lie.
One of the biggest things they have gotten right is branding. Their logo doesn’t say “Apple” anywhere. And yet, you see it and just know. Apple. It’s a McDonald’s/Nike/Starbucks level of recognition. That’s impressive. The reason for this is the overall branding of Apple’s products as not just a computer or phone, but as a lifestyle choice. An Apple product is more than a chunky plastic workhorse. I’ve been in many – I use this term lovingly! – snobby people’s homes that look like something from a magazine or Pinterest board and have often seen a large iMac front and center in the living room, frequently in place of a TV. These are well-designed products, pleasant to look at, not things that get tucked away in a home office. People like to be seen with an Apple product. The pride in the design, on the part of both the company and its customers, is a large reason for the company’s success, and the Apple logo represents that.
The other aspect of Apple that has given them a solid edge over their competition is the environment they created. Apple has its own ecosystem from the moment you buy the computer to every time you use it to rent a movie. Its products all work together seamlessly and require no third parties. The strength of Apple’s ecosystem extends beyond the retail side of Apple Stores and iTunes; it is inherent in the software as well. Both the desktop operating system (OS X) and the mobile operating system (iOS) are built in-house at Apple, in collaboration with the people who design the hardware. This gives a level of seamlessness that no one else can boast. Android, for instance, has to work on many devices made by many different companies, which means glitches and bugs are going to be more common. Many complain about Apple’s closed-off ecosystem, but it certainly serves a purpose in functionality.
So why did no one seem to care about NFC until Apple jumped into the game? Because many other tech companies have a history of putting out half-baked prototypes that don’t make enough of an impact to stick around. Apple has a reputation for taking its time and putting out a product that will integrate easily with the rest of its carefully constructed environment.
Is this fair to other companies? Of course it is. Companies like Samsung, Microsoft, and Google are successful in their own way. They’re all worth billions of dollars as well. They’ve all found their own niche and taken it. But Apple’s niche happens to be the one with the widest reach: taking a complicated tech product that others haven’t quite nailed down and making it accessible and understandable to everyone.
January 14th 2015
The most significant technological advances are the ones that allow our lives to become more efficient, which in turn frees up more hours in the day for activities beyond basic hunt-and-gather survival. Mass distribution of information – from Gutenberg’s press to WordPress – has allowed information to be freely generated and accessed, improving people’s ability to educate themselves. Things that were once time-consuming are now automated. In fact, a good way to tell whether a new invention or idea is going to stick is this: will it create more hours in the day for the average person?
Someone is always going to make an objection about putting too much faith in technology, and it’s good to have those checks and balances. But the fact that it’s commonplace (and safe) to sit in a metal tube with hundreds of other people 30,000 feet up in the sky while we zip along at 600 MPH shows that what we’re already capable of is not any more extreme or dangerous than other scary-sounding possibilities. Like self-driving cars.
This past weekend my roommate and some of his buddies were watching the 2004 movie I, Robot (which has held up quite well for a brainy Isaac Asimov story that got Will Smith’d harder than anything since Bad Boys 2). Watching the movie’s driverless cars in action after a week of reading about this year’s CES was surreal, because what I was seeing in a decade-old action movie and that day’s headlines were essentially identical.
I’ve admittedly been a slow convert to the self-driving car idea, mainly because of safety concerns. But as more companies develop safety measures in their cars’ software and sensors, it becomes more reasonable to accept. It’s easy to be cynical about this – why would we depend on a computer to tell us where to go? But we’ve been doing that for a while. While there’s obviously a human element to machines such as airplanes and visually impaired vehicles such as submarines, the human operators are still dependent on radar or sonar. We already trust these machines not to break down every day, so why not take that a little further? (I know that sounds like something the greedy tycoons in Jurassic Park would say, but bear with me.) Our safety track record is honestly pretty good. We sent people to the moon and brought them back alive nearly half a century ago. By the time any self-driving vehicle legally makes it to the marketplace, it will have gone through so much red tape and testing that it will be the safest way to travel by far. Will there still be accidents? Yes. There will always be accidents; unfortunately, that’s just how this world works. But there will be fewer.
To me, one of the most exciting aspects of self-driving cars is the potential for regaining lost time. For people in large areas of urban sprawl, spending a few hours in the car every day is a fact of life. When you think of all the man-hours being burned away in a driver’s commute, those numbers really add up. Automated processes take time away from menial tasks and allow people to spend more time with the families they are providing for, as well as on their own personal self-improvement. Automated cars will free up time for any kind of work or activity you would normally do on a train or bus commute.
There are challenges ahead, of course. Most people assume they’re the best driver on the road and any problem lies with “other people.” Convincing a population that equates turning 16 and the freedom of driving yourself around with being an adult is going to be tricky. Potentially, the biggest hurdle will be the unknown factor of a computer driving your car. Legally speaking, things will get complicated. What happens if your driverless car does cause an accident or hit someone? Who’s to blame? Road laws and insurance policies will have to be completely rewritten. If self-driving cars do drastically reduce accidents, what will that do to traditional driving? Will driving a manual car become illegal? Should it? Issues like this will cause actual practical implementation of this technology to be very difficult at first.
Different companies have different approaches right now, but they’re all aiming for the same target. The majority are aiming for a mostly automated driving experience (particularly useful on highways) that still allows for human intervention. Tesla Motors’ CEO, Elon Musk, estimates that within this year 90% of the driving time in their cars will be automated. A few, like Google, are going bold by completely excluding steering wheels and acceleration controls in favor of only a touchscreen GPS system. In a similar fashion, Mercedes debuted a high-class concept car at CES last week that allows the front seats to swivel 180° so all the passengers can face each other and chat, old-school carriage-style.
There will always be things that go wrong, but advanced technology like this will help eliminate dangerous factors and make driving a safer experience for everyone. And in the new “internet of things,” smart cars are undeniably poised to be the next big thing. The changes will be gradual, but they’re on the way. And since most of these driverless automobiles are going to be hitting the marketplace after electric cars have had a few more years to become ubiquitous, it’s safe to say that the entire automobile industry is going to be a completely new beast in just a few short decades.
January 8th 2015
Right now CES is going on and that means there’s some pretty neat tech on display. It also means a lot of this. Tech companies have a unique dilemma these days. They want to provide consumers with a product line that is constantly innovating and changing. But they also want to deliver something of a quality that won’t be wasted on an item with such a short shelf life. Currently, the tech industry is overrun with so much stuff. Right now gadgets have a remarkably short lifespan, partially because the industry is still new and devices are outdated within months of their release. This has led to a mindset of disposability that is shockingly common for products that cost as much as these do.
With relatively new industry processes such as mass production, a modern company can manufacture products and put them in any number of retailers, generally limited only by its budget. Consumers have a buffet of choices when going tech shopping; for the most part, they’re choosing between dozens of the same thing in different packaging. Inevitably, there is now consumer fatigue. Not only is there an over-saturation of tech, but a lot of it is lacking in quality, both as a physical product and in its design. It’s been a deluge of mostly shoddy products, and consumers want relief.
Think of a well-designed product like computer generated imagery (CGI) in a film. A movie like The Avengers is fun to watch and impressive in the scope and size of its battles and flashy effects, but you know what other movie had a ton of CGI and won an Oscar for it? Forrest Gump. When it comes to an effective product, you want Forrest Gump. Unassuming and effective. The design is there front and center, but also invisible. Design tells you how to use the product without ever having to read instructions or watch a demonstration. It should be natural and intuitive, without drawing too much attention to itself.
When you have a product being used for a dozen different everyday tasks, it pays to have something that lasts and makes an impression. A company can make a fast buck by cranking out something cheap, satisfying itself and its consumers for a short time. But eventually people become dissatisfied with their hollow plastic phones shutting down after a year. They’ll want something with a bit more weight to it, something that can be used for more than a couple of years at a time. Striking a balance between a solid product line and constant innovation, without making a cheap disposable toy, is crucial.
Soon we’ll see tech products released less frequently. The ones we do see will be longer-lasting, not only because their build quality will be less disposable, but also because big innovations within existing industries will come less and less often. The practical reason is that newer industries like smartphones and tablets are stabilizing, and their products will have longer life-cycles as a result. The next innovations will then come along, create new industries, and the whole process will likely start over again.
December 3rd 2014
Advancements in technology are an undoubtedly great thing. To us, the most appealing aspect is the accessibility. What makes technological advancement so powerful is that it takes the unusual and fascinating and makes it available to everyday “normal people” like you or me.
In the days of the printing press, mass communication became much easier than ever before. But you still had to have a printing press, or at least access to one. Today all you need is access to a computer. And if cash is tight, you can buy a Chromebook for $200 or use a buddy’s. Manufacturing a product used to require heavy capital to fund it and get it to an assembly line. Now, with access to a 3D printer, prototypes and products can be printed on demand.
This leveling of tech makes it a great time to be alive. As an example, 3D printing has actually been around for several decades, but because of patents held by the companies that innovated the tech, it was unavailable for anyone else to use. Technology that should theoretically be affordable (in the $500 range, the price of a cheap computer) was instead built with five-figure price tags and sold to a very niche market. This is the reason Elon Musk of Tesla Motors patented the company’s innovative electric car technology but gave others free rein to use the same discoveries. He was concerned that less ethical companies would swoop in and patent things for the purpose of sitting on them, thus inhibiting the entire industry. (This happens in just about every industry all the time. Hollywood studios will often buy risky scripts with no intention of making them because they’re scared someone else will do it if they don’t.)
Tech done right gives everyone, regardless of their place in society, the ability to make themselves heard as well as the power to create. Tech this powerful is hated by influential people who aren’t willing to adapt. Oppressive governments hate the Internet because it lets a downtrodden population see the world through an unfiltered lens. Big oil companies and established automakers hate electric car makers like Tesla because they threaten their comfort zones.
The internet has shown us the advantage of open source software. By opening its code to the world, a project can grow into a fantastic, fully-polished product, always evolving at the hands of anyone willing to improve it. Blender is powerful 3D modeling software considered among the best in the world. Mozilla is a non-profit organization that backs open source projects such as Firefox, one of the most widely used web browsers in the world. WordPress, which we are obviously big fans of, is open source as well. Physical tech is headed in this direction too. 3D printing will not only open up a whole new world by making manufacturing more personal, it will also force large companies to change as they face new challenges in battling piracy. “You wouldn’t download a car” isn’t all that hyperbolic anymore.
This is a good thing. Competition is good for business. But when your competition is just a couple of other multimillion-dollar corporations who are also playing it safe, entire industries stagnate. A guy selling cars that run on solar-charged batteries is a problem for every other car manufacturer in the world, because staying in business will require them to, you know, work hard and take a risk. Musk putting Tesla’s patents up for grabs while still keeping them legally protected is a first step toward a world where knowledge is something to be shared and used to make life better for everyone.