The Charitable Side of Technology

March 25th 2015 in Blog

One of my favorite authors in high school was Michael Crichton (as it was for many people, Jurassic Park was my gateway drug). I've always appreciated Crichton's stories because he never swung to one extreme or the other in exploring the implications of cutting-edge ideas. As a scientist himself, he understood and respected the importance of innovation, discovery, and learning. But he also knew that the entire field is a Pandora's box: once something has been unleashed on the world, you can never take it back. That's why criticizing a work like Jurassic Park as impossible misses the entire point. The story isn't actually about cloning dinosaurs. The cloned dinosaurs are a metaphor for the potential consequences of unchecked advancement and discovery fueled by ego in place of goodwill. But rather than condemn advancement and discovery, he understood that it was like any other force or object: depending on who wields it and why, it can be a tool or a weapon.

"Genetic power is the most awesome force the planet's ever seen, but you wield it like a kid that's found his dad's gun." - Jurassic Park, written by Michael Crichton and David Koepp

“Genetic power is the most awesome force the planet’s ever seen, but you wield it like a kid that’s found his dad’s gun.” – Jurassic Park (1993). Written by Michael Crichton and David Koepp, directed by Steven Spielberg

Possibly technology's greatest asset is its capacity for charity. Many of the notable advancements being made in the tech world today spring from a charitable mindset. Once the "wow" factor wears off and costs come down, new inventions can improve the quality of life for millions, especially the less fortunate. That's why we now live in a time when ailments such as blindness and deafness can often be treated with relatively inexpensive implants. The guys who cashed in on the early days of computing and the internet – Elon Musk, Richard Branson, Jeff Bezos, and Bill Gates are a few prominent examples – have turned their attention toward cultivating advancements that won't yield immediate results but instead lay the groundwork for future generations.

Bill Gates was recently asked in an online Q&A about the use of technology to extend lifespans. He pointed out that it would be selfish for people like him to pour research into that area while diseases that have been wiped out in most first-world countries still ravage third-world ones. And Gates has been willing to put his money where his mouth is; since leaving his leadership positions at Microsoft, he has put the same effort into helping people in developing nations that he once put into advancing home computing. One of the most important projects he is funding is new sanitation technology that not only keeps homes and streets clean, but also makes efficient use of the waste through energy generation and resource recycling.

Which is a nice way of saying "it makes water for you to drink out of poop." Check it out, though; it's actually fantastic.

Crowdfunding has shown that a large group of people from all over the world can each provide a small piece to create something huge. The e-NABLE group is a community started by Jon Schull, a professor at the Rochester Institute of Technology, that connects people throughout the world who need prosthetic limbs with people who can use their 3D printers to help design and build low-cost alternatives. Children especially are often deprived of a substitute, because full-blown robotic limbs are not only prohibitively expensive but will also be outgrown multiple times before the child stops growing. e-NABLE's alternatives run around $50 for a hand and $150 for a limb, and can be customized and replaced part by part for almost nothing as the child continues to grow.

Don’t be afraid of technology. Instead, be wary of those who use it exclusively for personal gain and support those who use it to improve the lives of others.

Buying a Concept: The Rise of Crowdfunding

March 18th 2015 in Blog

The thing that makes the Internet great is that it breaks down the gatekeepers in a variety of industries. In fact, it doesn't even acknowledge they exist anymore; they've been circumvented and usually don't realize it, or admit it, until it's too late. Making a product for commercial mass production is a huge challenge. Unlike many other endeavors, like writing a business plan or a novel, there's overhead inherent in creating a prototype just to be able to pitch. Generally speaking, the people who are going to be investing big bucks are going to be older, and they often don't have the interest in, or knowledge of, anything too unusual or tech-oriented.

Enter crowdfunding. Crowdfunding has always been a great way to get a decent amount of funding with little commitment from each backer. A great example is filmmaker Darren Aronofsky soliciting contributions of $100 each from friends and family to get the $60,000 he needed for his first movie, Pi. The film made over $3 million, and his funders each received back a promised sum of $150, plus a credit on the film. He is now considered one of the best contemporary auteurs of cinema, and his last two movies, Black Swan and Noah, made over $300 million each.

There’s an idea; let’s crowdfund one of these suckers.
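
If you want to sanity-check the Pi math above, here's a quick back-of-the-envelope sketch in Python. The figures ($100 pledges, a $60,000 budget, $150 paid back per funder) are the ones quoted above; the rest is plain arithmetic.

```python
# Back-of-the-envelope math on the Pi funding story (figures from the post).
pledge = 100          # dollars contributed per friend or family member
budget = 60_000       # what Aronofsky needed to shoot the film
payout = 150          # what each funder was promised back

backers = budget // pledge
print(f"Backers needed: {backers}")                      # 600 people
print(f"Total paid back: ${backers * payout:,}")         # $90,000
print(f"Return per backer: {payout / pledge - 1:.0%}")   # 50%
```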

Of course, not everyone has people nearby who can fork over $100 just like that. And even that project took five years to fully fund. What's great about crowdfunding, once it becomes a more streamlined process, is that you are essentially pre-ordering a product. And placing quite a bit of trust in the idea that someone you've never met is going to be able to deliver.

There's also the added benefit of accountability. While Kickstarter does its best to screen projects, it makes it clear that you are responsible for what you fund with your own money. Even with the added benefit of crowdfunding through a legitimate service like this, there have been a few high-profile snafus involving projects that failed to come to fruition or, even worse, creators who bailed on their promises altogether.

Most Kickstarter projects that run past their deadlines aren't doing it because the creators are lazy or incompetent. For many people seeking funding, this is whole new territory. Even when you have the money, getting a physical product into production is a vulnerable process where any number of things can go wrong. I have a friend who likes to back Kickstarter projects such as games and collectibles; he usually assumes he needs to add an extra month or two to the promised delivery date before he expects anything to arrive. In his experience, the more money a project gets, the more likely the delays. When a creator asks for $100,000 and instead receives $8 million, that's a whole lotta extra product to plan for. Delays are expected at that point. In this case, the responsible crowdfunding entrepreneur should provide regular updates on both the good and the bad. That kind of honesty about the manufacturing process will generate understanding and leniency from the backers.

Once the backer accepts that reasonable delays are part of the logistical minefield that is crowdfunding, that patience can be rewarding. And for the most part, the transparency provided by a public platform is enough to dissuade anyone from using these services for dishonest personal gain. People know when they're being lied to, especially when you have their money. And if not enough people back a project to meet the goal that has been set, then no money is taken and the pledges are cancelled.

Crowdfunding is definitely here to stay. Looking at Kickstarter's official numbers is pretty interesting. Since 2009 the service has helped raise $1.6 billion in pledges. About $1.36 billion of that went to successfully funded projects, even though only 38.83% of projects actually hit their goal. In other words, the majority of the successful dollars are going to projects that are actually worth taking a look at. But for every successful Pebble/Exploding Kittens/Star Citizen, there's a whole lotta crap like this, which drags the success percentage down. Like other social outlets for creativity, such as YouTube, the cream rises to the top and 99% of the rest is total garbage.

And then there’s the tasty, tasty 1%.
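
For the curious, here's that Kickstarter math spelled out in a few lines of Python. It shows why the dollar totals and the 38.83% project success rate tell two very different stories; the numbers are the official figures quoted above, rounded.

```python
# Kickstarter's "success rate" looks very different by dollars than by projects
# (figures quoted in the post, rounded).
total_pledged = 1.6e9          # dollars pledged since 2009
successful_pledged = 1.36e9    # dollars that went to successfully funded projects
project_success_rate = 0.3883  # share of projects that hit their funding goal

dollar_success_rate = successful_pledged / total_pledged
print(f"By dollars:  {dollar_success_rate:.0%} of pledged money backed successful projects")
print(f"By projects: {project_success_rate:.0%} of projects hit their goal")
```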

The idea of buying a concept is starting to show up in other kinds of industries. Electric car manufacturers such as Tesla can send software updates to their cars, similar to the way your iPhone gets them. Camera manufacturer Blackmagic Design has a new model of 4K camera called the URSA that lets the user swap out the image sensor when a new one becomes available, allowing for better image quality without having to buy a whole new camera in a few years.

As long as the temptation to release half-finished products is avoided, this business model will prove extremely fruitful. By releasing a product that is high-quality and upgradable, the customer has less stress over having to buy something all over again whenever updates and changes are made. The relationship with the company is also more personal, since there is now a direct line to the consumer; and smaller companies are getting a chance to jump in and prove their mettle.

USB-C And The Challenge of Change

March 11th 2015 in Blog

The first patent for an electrical clothes iron was filed by Henry Seely of New York City in 1881. Besides the numerous safety concerns that needed to be solved before bringing it to market, there was also the issue of even being able to use the device in homes. At the time most electrical use in homes was for light sources and some of the first electric appliances were powered by connecting them to lightbulb sockets.

Seely's device wasn't technically all that far ahead of its time; patents for more familiar plug and socket systems started appearing in Europe in 1883. But it wasn't until the mid-1920s that an electric steam iron became commercially available as a consumer product. In fact, none of Seely's original devices exist to this day. His invention is often considered a great example of something just too far ahead of its time for its own good. But in tech innovation, being ahead of the times is what gets things going.

Upon its debut in 1998, the iMac was the first consumer computer to ship without a 3.5-inch floppy disk drive. In addition to this deliberately bold absence, the iMac also made use of USB ports in place of classic "legacy ports" such as PS/2, VGA, and serial.

These bastards.

Here's how everything in the world works:

If you want something to become better, you have to take the steps to get there. If you want to run a marathon, you have to train and be disciplined and manage your diet and do all kinds of things that make your body hurt while it becomes stronger. If you're trying to get a hotel in a game of Monopoly, then you know that you have to buy four green houses before getting to that sweet, sweet red (unless you play like a barbarian with bank loans and Free Parking money).

Like anything worth doing, you can't just whip up the best computer in the world and be done with it. Getting there is a long and gradual process of trial, failure, and occasional bouts of innovation. As long as computers are a main hub for productivity, learning, and general everyday tasks, we will never reach a specific end goal of Best Computer Ever. It's a potentially infinite road, paved by some of the smartest people on the planet.

Critics, and the public at large, are skeptical of sudden change in computing standards. That's understandable, because it usually requires buying more things. Things that aren't usually cheap. But at the same time, everyone wants their computers and electronics to be faster and more efficient.

Consumer products are made by large businesses that need to make money to survive. As long as people buy what they sell, they can keep doing what they do. The most idiotic complaint directed at the tech world is "why would you update that model? I just bought one last year! I'm so mad!" People assume they got ripped off when a product is updated too soon after they buy one.

But any company worth its salt won't let a product line stagnate and wither. Smart, successful companies are always working to improve the latest versions of their goods. That means tinkering with a product until it's ready for a big update. Keep in mind that in the computing world, a lot of things get built from scratch. Extensive testing and returns to the drawing board are not uncommon. This can take a while, and it makes sense to push the current product as far as it can go in the meantime. And when it comes time to replace yours, you will find a newly enriched product, often with a similar price tag and more powerful guts.

This week, Apple caused an interesting ruckus with the reintroduction of the MacBook, minus the "Air" and "Pro" descriptors. It's a pretty piece of machinery that's stupid thin and boasts Apple's trademark minimalism. But is it too minimalist? There are only two ports on the device: a standard 3.5mm headphone jack and a multi-purpose USB-C port.

As more manufacturers include USB-C in their products, it's poised to become an industry standard. Unlike Apple's other cables, such as Lightning and Thunderbolt (developed with Intel and used by other manufacturers as well), USB-C is a true catch-all connector. As you've probably guessed by the lack of other ports on the new MacBook, USB-C carries power as well. It's also a high-speed data transfer and AV I/O cable. And it's very thin – almost as thin as the Lightning charging cable.

Whether you like Apple or not, you have to admit that they are respected as tastemakers of the tech world. The iMac's bold design choices set a personal computing standard that has evolved across many companies' products to this day. iTunes stared down physical media in entertainment while cracking its knuckles and muttering, "I'm coming for you, man." Smartphones were around before the iPhone, but no one could get one right until Apple banished the physical keyboards that were holding the idea back.

The pattern is consistent. Apple takes a good idea, strips away any of the unnecessary crap, and makes it more efficient. By innovating with the purpose of removing unnecessary junk, Apple has consistently changed the way we transfer data from our various devices. And with wireless charging soon poised to make a debut in personal computers, one data port is all you’ll need.

Sometimes you have to be crazy enough to release a product a little before its time. And for now, the other Macs still have their ports; Apple is playing it smart by not tinkering with any of their current product line, but instead resurrecting a dead one to experiment with.

And if you don't like it, don't buy it.

Apples and Pebbles: Comparing Smartwatches

March 4th 2015 in Blog, Technology

Kickstarter has produced some wacky projects that have really taken off. Everything from card games about exploding kittens to some dude who got a serious case of the munchies has been successfully funded. It's become its own Shark Tank reality show in a way; for every good idea, there's a ton of stuff that would make even the most jaded SkyMall customer blush.

In 2012, startup Pebble broke the Kickstarter funding record with its original run of smartwatches, taking in over $10 million in pledges from over 68,000 people (their target was $100,000). Pebble became a hit and has now shipped over 1 million watches. Now Pebble is back on Kickstarter with an updated version of the watch called the Pebble Time. In just a week they have rocketed past their goal of $500,000 to over $15 million, once again making it the most funded project in Kickstarter history. I haven't had a chance to play with one in person, but the Pebble does seem to be a pretty cool little gizmo, especially for the price point.

And totally not Apple Watches.

The Pebble is definitely a grassroots hit, and with an almost too-simple look, complete with e-ink display, it's arguably more minimalistic and devoid of bells and whistles than the forthcoming Apple Watch from the company that most prides itself on sleekness. The Pebble Time is already being called the Apple Watch killer, despite the fact that neither device has been released yet. A huge deal for both is battery life. The Pebble Time boasts a rather impressive 7 days between charges, while the Apple Watch is going to have to be recharged every night, assuming it makes it through the entire day (because – let's be honest here – people are going to try and watch YouTube videos on it and be disappointed when that doesn't work out so well).

I'm willing to give both a chance. At this point, they both suffer from the biggest con of all: they're both smartwatches. I've seen maybe two smartwatches in person, ever. At least one was just a cheap LG or something. And while the Pebble doesn't give me the same skin-crawling "calculator watch" feel that pretty much every other smartwatch does, it still looks like something you might get in a Happy Meal. The Apple Watch, on the other end of the design spectrum, looks nice but still feels a little too flashy. Apple is known for making products that declare themselves to the room in their smug "too cool for school" Jony Ive way, but until now you haven't had one strapped to your wrist. It would almost be worth waiting a year to get one just so that every time I try to make a point in a conversation my watch doesn't become the topic. (I'm not Italian, but when I'm deep into an animated talk, my hands definitely are.)

“Hi! My watch is better than you.”

People like everything divided up into neat categories of black and white, right or wrong. In the realm of mindless fun, like sports, it makes sense: X sports team is obviously better than Y sports team because they're my favorite! When you let this mindset loose in actual real life (like, say, making politics exclusively Democrat vs. Republican), it frankly makes you look like an idiot. I'm a fan of Apple products, so when a dyed-in-the-wool Windows or Linux user tries to bait me into a fight about "which team is better," my response is kind of a shrug of "everyone needs a different product for different things." This was especially confusing to people who would try and pick computer fights with me when I worked at an Apple store.

People are obsessed with the idea of “better,” and most of the time it comes from an insecurity that they’ve made the wrong choice somehow. They need validation that where they’ve put their money is indeed The Best. But here’s the thing. It doesn’t actually matter. If you like it, and it does what you need it to do, then for you it is The Best. Who gives a damn what everyone else thinks? I have better things to do than argue with someone that Apple makes better computers than everyone else. For me and my needs, that is true. For others, that may not be. Same goes for Android vs. iPhone, Pebble vs. Apple Watch, and so on.

Pebble is probably not going to dip into Apple Watch sales any more than the Apple Watch will dip into Pebble's. As far as I can tell, they're angling for different kinds of consumers. And that's great. A healthy marketplace will always have room for rivals. Not only does it give consumers more options, but it keeps each company on its toes when it comes to constant innovation. Comparing them past a certain point is all about personal preference, and it gets pretty nitpicky. Don't get too worked up about what the other guy is using. Just choose what works for you and have fun with it.

AI Part II: The First Threat

February 25th 2015 in Blog, Technology

Every generation has that guy standing on a street corner with a big ratty DOOMSDAY sign. Sometimes, like in the recent Cold War era, everyone else joins him. For a while, terrorism was Fear's employee of the month, but that hot streak has cooled by now. It's still around, but its desk got stuck in the basement with Milton from Office Space. There are a lot of fringe-sounding ideas getting talked about in mainstream media these days, since big tech moguls (who tend to be a pretty eccentric bunch) are getting a lot of face time and becoming household names. One of those fringe ideas is artificial intelligence.

A few well-known moguls have expressed rational levels of concern without being too much of a buzzkill about it: Bill Gates and Elon Musk are both interested in AI development, but only if it is done with great thought. Stephen Hawking is less thrilled by the idea and has said that AI could spell the end of the human race.

AI’s most tangible threat is currently unfolding in low-level jobs where it is already being implemented to replace human employees.

While jobs that require a very specific and trained set of skills are often aided by AI, they still employ highly trained individuals. Less glamorous workplaces, such as factories and manufacturing plants, have already started to replace humans with automated robots. More and more, any job that doesn't allow for thinking on the fly or outside the box is going to be targeted, and the work being targeted for replacement is usually repetitive and robotic to begin with. The average, everyday job seems to be safe – for now. According to Bill Gates, as artificial intelligence develops, it will start to edge out middle-class employees as well.

But after that warning, Gates hypothesizes a refreshingly optimistic solution. He says that while concerns arising from the replacement of people by machines are valid, this is a chance to improve education and help people survive displacement by the coming robotic workers. After the recent financial crisis, economic downturn and job loss are a more relatable and worrisome threat to the current working generations than the vague, always-impending physical destruction of the world that was prevalent in the Cold War (and which is usually the bread and butter of AI doomsayers). If AI does pose any threat in the immediate future, it is in socioeconomic areas such as this.

Foxconn is a massive Taiwanese electronics manufacturer whose Chinese factories are notorious for the slave-like treatment of their 1.2 million workers. Conditions are harsh and hours are inhumanely long, forcing many employees to live in crowded factory dorms just so they have time to sleep, much less a place of their own to do it in. In 2010, 14 workers committed suicide by jumping off of factory rooftops. Foxconn's response was the installation of suicide nets. Many of the companies using Foxconn's services have turned up the heat in regards to the treatment of workers, including Apple (which is currently bringing some manufacturing back to the United States – it opened a facility in Austin, Texas in 2013 and another is being planned in Arizona).

In 2012, Foxconn invested in 10,000 robotic arms, at $25,000 a pop, to replace workers, with 20,000 more on the way. The plan has already started to backfire, as the robotic arms are not precise enough for many of the company's products, requiring it to hire back as many as 100,000 workers.

Disposable treatment like this says a lot about the business environments that are driving a demand for workers to be little more than robots. Companies like Foxconn are just as ruthlessly efficient and demeaning to humans as any hypothetical human-hating AI singularity would be. Bill Gates is right. If AI software is advancing fast enough to shuffle people around like a pile of computers, then the best way to prevent any AI-related trouble in the job market is to create better opportunities, training, and education. AI is just a complex tool. For the foreseeable future, the most dangerous thing will be the people wielding it. And if those people continue to treat other people this way, any future AI they create will be just the same.

“I learned by watching you!”

AI Part I: Introduction to the Uncanny Valley

February 18th 2015 in Blog

No conversation about the future is complete without a discussion of artificial intelligence (AI). There are quite a few misconceptions about what AI actually is, and everyone has a slightly different way of approaching it. Usually opinions sit at one extreme or the other: either gloatingly dismissive or obsessively paranoid. The rational response probably lies somewhere in between.

Here’s a more rational approach.

AI is not a living thing and it never will be. It's exactly what the name implies: a simulation of intelligence that can give the illusion of being a sentient being. No matter what, anything and everything it does is the result of complex programming. But when it gives the illusion of being a living thing, and does it convincingly, what kind of effect will that have on people?

There's a concept in robotics (and it applies to other simulations as well, such as computer-generated imagery – CGI – in films) called the "uncanny valley." The uncanny valley, put forward by roboticist Masahiro Mori in 1970, builds on the fact that our brains can recognize when something is "off" in a fellow human. If someone is lying to you, you may believe them, but if they have a subtle "tell," your brain can pick it up and process it without you fully realizing it. Presumably, you are used to truthfulness in your interactions with others, so even the subtlest lie can give a sensation of unease.

As simulations both virtual (CG) and physical (robotic) continue to advance, our brains happily accept what we're seeing as long as it still reads as obviously fake. But when, say, a CG human in a movie is 99.9% realistic, the upward curve collapses into the uncanny valley. The uncanny valley is the point where every single tiny thing that's "off" jumps out at us. Simply put, a simulation of life, like a robot, keeps becoming more realistic and relatable until it is so close to being a hundred percent indistinguishable from the real thing that the tiniest details leap out at you, often causing discomfort. You can watch a CG animation like a Pixar movie, with traditionally exaggerated cartoonish humans, and not feel creeped out. Your brain doesn't pick up the flaws of the blue cat people in Avatar because you know that blue cat people aren't real and your brain isn't wired to recognize them. But when you watch something attempting photorealism, your alarms go off. You may not be able to say why (although it's usually described as soulless dead eyes).

Creepy Stoner Child never really caught on as a Christmas tradition.
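
If a picture helps, here's a rough sketch of Mori's curve. The control points below are invented purely to show the shape of the idea (a steady climb, a sudden plunge just short of full realism, and a recovery at the real thing); they aren't measured data, and the snippet assumes you have matplotlib installed.

```python
# A purely illustrative sketch of Mori's uncanny valley curve.
# The numbers are invented to show the qualitative shape, not real data.
import matplotlib.pyplot as plt

# x: how human-like the simulation looks (0 = industrial robot arm, 1 = healthy human)
# y: how comfortable we feel about it
likeness = [0.00, 0.20, 0.40, 0.60, 0.75, 0.85, 0.90, 0.95, 1.00]
affinity = [0.00, 0.10, 0.30, 0.55, 0.70, 0.30, -0.60, 0.10, 1.00]

plt.plot(likeness, affinity, marker="o")
plt.axvspan(0.82, 0.96, alpha=0.15, label="the uncanny valley")
plt.xlabel("human likeness")
plt.ylabel("affinity")
plt.title("The uncanny valley (conceptual sketch)")
plt.legend()
plt.show()
```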

Let's say that AI does become self-aware and reaches the theoretical point called the singularity (more on that in the next part). The machine knows it's a machine, but that doesn't change anything about its own existential problems. It's still just a machine. If it looks like a trashcan, it'll be easier to accept. But if it looks like a regular person, you might have trouble rationalizing to yourself that this is a walking iPod. Animators know this trick; that's why everything from Disney princesses to Avatar cat people to WALL-E has massive eyes. It instantly makes the character empathetic (and sells tons of toys). Even cars have "faces." It's also why, in PG action films, bad guys are usually masked. It's easier to accept that the heroes are killing hundreds of other humans if we can't see the faces they're snuffing the life from.

I've talked to people who don't care about any of this at all. An AI-powered robot is just a glorified computer; therefore, any concerns are moot, so why should we care? I agree with the first part, but I disagree that there shouldn't be any concern at all. Any unchecked advancement in a field of science will lead to trouble.

Innovation And The New Space Age

February 11th 2015 in Blog

They always say that hindsight is 20/20. That usually seems to apply to the speculation traditionally presented in the science fiction genre. To some degree, at least. There's so much being explored in the realm of futuristic fiction that it's difficult to see what's going to stick until it actually happens. Jules Verne's submarine was far-fetched at the time, but a few decades later, U-boats were crucial to modern warfare. At the same time, other topics in science fiction – self-aware artificially intelligent machines in the work of Isaac Asimov and space colonization in that of Arthur C. Clarke, for instance – have remained firmly speculative.

Until now. For a good portion of the 20th century, this kind of space-age advancement was dependent on government programs. Indeed, much of the innovation we've seen came about as a result of arms racing, or the PR arm-wrestle that was the Space Race. Once there was no need for America to beat the USSR to the Moon, space travel became less of a government interest, leaving NASA in a state of decay and relegating astronauts to low Earth orbit research missions. The following statement is both fascinating and frustrating: we went to the Moon and came back safely before we had the internet and cell phones. And we haven't been back since.

What happened?

Innovative events like the Moon landing came from a need to reassure the people of America – at a time when political propaganda was widely accepted and less contested – that we were the best and we could beat anyone. Once we did, there was no reason to keep going further. The goal had been set and met. This is the same thing that happens when someone starts a company with the attitude of “I’ll sell it in two years for a gazillion dollars.” Those companies don’t generally fare quite so well as the ones that are started from the attitude of “let’s make a positive impact and, yeah, we’ll be sure to make some money on the way, too.”

Great innovation and true ground-breaking discovery come from a maverick attitude that is quite prevalent in the startup world. You've got an industry that is home to a few self-made billionaires (the kind who drop out of Ivy League schools to start Facebook or Microsoft) who generally have an anti-authoritarian attitude (remember, it takes a special kind of insanity to be this different) but often also a compassionate side for humanity as a whole.

Technological leaps aren't as easy as banging out a few prototypes and Kickstarting the rest. Thomas Edison famously went through 10,000 lightbulbs before he found the most viable solution. Your iPhone is a very affordable luxury, but it took millions of dollars to reach your pocket. And unless you are a big company that can dump tons of cash into what will likely end up being skunkworks, there's not a lot of immediate incentive to push things beyond the current status quo. This is where renegade billionaires come into play. Self-made entrepreneurs like Richard Branson or Elon Musk or Bill Gates don't necessarily need to see a return on every single thing right away (although it's still good business to do so). Because these are all people who see beyond the short-term advantages (immediate profits, for one), they can give complex ideas the time necessary to come to full fruition. They have the resources and patience to nurse something until it can spread its own wings. As an example, Musk's Tesla Motors reported its first profitable quarter in 2013 – ten years after opening up shop. Patience pays off, especially when you can afford it. On top of that, these are people interested in making the world a better place after they've passed on. Pushing for huge industry changes, like electric vehicles over gasoline ones, reflects that.

That's why we are seeing a second renaissance in complicated and expensive industries like spaceflight. Right now guys like Elon Musk are saying what NASA was saying when I was a kid: that we'll have someone on Mars in the next 15 years or so. Except this time, I believe the person saying it. I have no doubt that humans will have landed on Mars by 2020. NASA has been limping along as an underfunded government arm for several decades now. With no uber-patriotic cause needed to unite America, the drive isn't there at a government level. But SpaceX and Virgin Galactic don't need to ask politicians for permission to spend taxpayers' money on "useless" science playtime.

The kinds of innovations being made now aren't feats that can be accomplished in a garage. While garage projects will always be around, you can't get to space with spare car parts and popsicle sticks. Just like the Age of Discovery five centuries ago, the Age of Space will be funded by private corporations. And just as those in the Age of Discovery realized, insisting on a round trip means that any progress will be naturally impeded by the effort and time spent on returning. Just as New World colonization was a one-way ticket, so too will be missions to far-off places like Mars.

Mobile Apps in 2015: Some Numbers

February 4th 2015 in Blog

Here at Wovax we get really excited about tech in general and where its innovations are taking the world. This week, though, we're going to revisit some of our own territory. It's still exciting, and it shows what's coming – or rather, what's already arrived and how it will keep growing.

Let’s look at how mobile usage has grown recently.

Something we like to revisit from time to time is growth in mobile usage. Apple closed out 2014 with a record-making quarter of $18 billion in profit, thanks to last fall's iPhones. Considering they only barely sold more iPhones than the Android phones sold in the same time frame, that's a lot of devices for both sides of the mobile camp. It's not too surprising, given that 80% of internet users own smartphones.

Currently, there are around 1.75 billion smartphones in use worldwide, and that number is still climbing. As internet access continues to become less of a first-world luxury and more of a standard commodity, like a car or a lightbulb, those numbers will continue to explode.

Each piece of the mobile web pie is growing every year, and one of the pieces that is especially important to the business sector is online shopping. According to IBM, online sales went up nearly 14% during the 2014 holiday season compared to the previous year. This makes sense, as holidays are a busy time for everyone and people are on the go more than usual (which, nowadays, is quite often). People will be traveling and out of town, and when travel efficiency is a concern, mobile devices generally win out, even over a laptop.

Pulling back from November/December to get the full 2014 perspective, mobile app usage grew by 76% in 2014, with lifestyle and shopping apps leading the way at a whopping 174% growth. For years now, studies have been predicting that 2015 will be the tipping point where mobile rules internet traffic. Judging by current studies and the spiking growth of app-based mobile traffic and shopping, that is absolutely the case.
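
To put those percentages in concrete terms, here's a tiny sketch that applies them to a purely hypothetical baseline of one million app sessions. The 76% and 174% growth rates are the ones cited above; the baseline is made up for illustration.

```python
# What 76% and 174% year-over-year growth mean against a made-up baseline.
baseline_2013 = 1_000_000                     # hypothetical app sessions in 2013

overall_2014 = baseline_2013 * (1 + 0.76)     # all mobile apps: +76%
shopping_2014 = baseline_2013 * (1 + 1.74)    # lifestyle & shopping apps: +174%

print(f"All apps in 2014:      {overall_2014:,.0f} sessions")
print(f"Shopping apps in 2014: {shopping_2014:,.0f} sessions")
```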

As we've been growing our client base, we have discovered that many businesses and organizations – from restaurants to blogs to school districts to real estate – have quickly realized the obvious benefits of staking out a mobile presence. The efficiency and reach of a mobile app are the main attraction. They all have specific needs, and they can use the flexibility an app provides to meet those needs. For example, a school will generally place a high priority on being able to reach parents, teachers, and students for announcements or more serious matters like an emergency. Real estate is a no-brainer, as an app lets people peruse listings on the go while they house-shop, for instance.

Running a website represents a large part of many livelihoods. With the numbers from the last half-decade accurately reflecting what is happening now and where the traffic is coming from, it would be stupid to delay a mobile app presence any longer.

Probably not this stupid, but close.

Why Did No One Care About NFC Until Apple Did?

January 28th 2015 in Apple, Blog, Technology

One of the most talked-about aspects of last fall's new iPhone lineup was the Apple Pay feature. Using contactless NFC (near field communication) technology built into the phone, Apple Pay lets you make payments without a physical credit card. And by using your fingerprint instead of an easily stolen PIN, verification is more secure. Staying true to form for the best technology, streamlining simple tasks to be even simpler and helping declutter our lives, Apple Pay was a hit.

Much of the appeal for the upcoming Apple Watch comes from it featuring NFC as well, allowing you to pay with a swipe of the wrist.

Critics were quick to point out something: Apple didn't invent NFC technology. In fact, it's been present in phones for quite a while. Nokia started introducing NFC in 2006, and Samsung has been using it in their phones for several years now.

Words like "steal" get tossed around quite a bit in tech these days. Sure, tech companies are always at each other's throats in bitter court battles, but most of that is a formality. The heart of a thriving market has always been competition, and that only comes about when companies try to outdo each other. For the most part, this works out well for everyone involved. In trying to beat the competitor, a company will push itself beyond what it thought was possible and ideally create the best product it can. In the end, customers get a selection of top-notch products and services to choose from.

Pictured: some healthy competition.

Apple didn’t invent the computer. Instead, they made it more accessible to the general public. They didn’t invent the MP3 player, but instead used the classic “razor and blades” approach with the iPod and iTunes store respectively. Apple didn’t invent the smartphone, but they once again made it accessible and approachable to the average consumer. These innovations were so successful that in 2007 the company changed its name from Apple Computer, Inc. to a simpler Apple, Inc., thus signifying a shift to consumer electronics in a broader sense.

This isn’t a “let us now bow down and worship Steve Jobs” article. But as it stands, Apple is currently the world’s most valuable brand. Even if you hate them, you have to admit that they are clearly doing something right. So what is it?

The two things that Apple has gotten ridiculously right are branding and environment. I've talked before about the anecdotes I collected working the sales counter at an Apple retailer, and I'm still amused by the number of people who would complain to me about how much they hated Apple because of how expensive it is, or because Steve Jobs was a jerk, or because Apple is about to start losing money, etc., etc. Joke's on them, though, because they still came in to buy a $2,000 Facebook machine. Oh, and Apple just had the most profitable quarter of any company in history. As in, they blew past the previous record quarter, which was held by a natural gas company. Right after "everyone" said their gargantuan iPhones would fail. But negative critics are often the loudest. Most people are happy with the products that Apple offers. Those kinds of numbers don't lie.

The first is branding. Apple's logo doesn't say "Apple" anywhere, and yet you see it and just know: Apple. It's a McDonald's/Nike/Starbucks level of recognition. That's impressive. The reason for this is the overall branding of Apple's products not just as computers or phones, but as a lifestyle choice. An Apple product is more than a chunky plastic workhorse. I've been in many – I use this term lovingly! – snobby people's homes that look like something from a magazine or a Pinterest board and have often seen a large iMac front and center in the living room, frequently in place of a TV. These are well-designed products that are pleasant to look at, not things that get tucked away in a home office. People like to be seen with an Apple product. The pride in the design, on the part of both the company and its customers, is a large reason for Apple's success, and the logo represents that.

Well, it does now.

The other aspect that has given Apple a solid edge over the competition is the environment they've created. Apple has its own ecosystem, from the moment you buy the computer to every time you use it to rent a movie. Their products all work together seamlessly and require no third parties. The strength of Apple's ecosystem extends beyond the retail side of the Apple Stores and iTunes; it is inherent in the software as well. Both the desktop software (OS X) and the mobile software (iOS) are built in-house at Apple, in collaboration with the people who design the products. This gives a level of seamlessness that no one else can boast. Android, for instance, has to work on many devices made by many different companies, which means glitches and bugs are going to be more common. Many complain about Apple's closed-off ecosystem, but it certainly serves a purpose in functionality.

So why did no one seem to care about NFC until Apple jumped into the game? Because many other tech companies have a history of putting out half-baked prototypes that don’t make enough of an impact to stick around. Apple has a reputation for taking its time and putting out a product that will integrate easily with the rest of its carefully constructed environment.

Is this fair to other companies? Of course it is. Companies like Samsung, Microsoft, and Google are successful in their own way. They’re all worth billions of dollars as well. They’ve all found their own niche and taken it. But Apple’s niche happens to be the one with the widest reach: taking a complicated tech product that others haven’t quite nailed down and making it accessible and understandable to everyone.

Digital Identity Is Becoming Important

January 21st 2015 in Blog

Science fiction is one of the most important and interesting genres of storytelling that we have. Its strength lies in the ability to be a little more on-the-nose than normal and make its own obviousness easier to swallow by transposing the story to a future world that somewhat resembles our own. This isn’t us, but it could be. This isn’t actually how oppressive our world is right now – or is it? We’re not actually living in a simulation – or are we? A common theme found in sci-fi is the idea that people are nothing more than numbers in a giant system that lets them exist so that it can exploit them. Barcodes to be scanned, numbers to be punched.

Science fiction.

One very sci-fi-sounding concept that already exists is digital identity. Your digital identity can be defined as who you are online: everything posted about you, both by yourself and by others, whether it's true or not; every brand, movie, band, TV show, restaurant, or sports team page that you like on Facebook or follow on Twitter. If someone (such as a potential employer) Googles you, whatever they find is your digital identity. It's your reputation and interests made tangible. But if you're plugged into the internet at all, it can be hard to control.

The current generation (mine) is growing up with a unique dilemma that no one else has faced. Social interaction used to be generally limited to direct in-person communication, and if something was written down, it was either intended to be read by only one or two other people or else meant to be seen by many. There wasn't a lot of in-between. Nowadays, it's easy to let the line between private and public blur. When you post on your buddy's Facebook wall that you wanna hang out, you're essentially standing in a room full of everyone you've ever met and shouting across everyone else to your friend. Not only that, but you know all the dumb little conversations your average day has? If you're typing it, it's stored away somewhere, theoretically forever. It's a weird way to communicate.

“You wanna hang out later?”

This leads to interesting side effects. Assuming you behave in mostly the same way online that you do off, it’s remarkably easy for third parties to figure out who you are. Not in a “we know your name and where you live” kind of way but more of a “we know what you like to eat for lunch and what you’re going to buy next and who you have a crush on.” This isn’t as sinister as it sounds; for the most part, they just want you to give them your money. And honestly, if I’m going to look at ads, I’d rather see ads for things I might actually want to buy.

The inconvenient and more invasive side of this has recently come into play in the way many potential employers treat social media. Some employers started demanding that job candidates hand over the login credentials to their Facebook accounts as part of the interview process. While this was swiftly made illegal in many states, it doesn't stop them from looking at what is publicly accessible. And while there's nothing wrong with the HR department taking a look at your Twitter, having a social media presence is so normal these days that not having one is also considered – perhaps unfairly – a red flag by many. The assumption is that people who choose not to share their lives on Facebook or Twitter are hiding something. This will likely be a passing fad as older-generation bosses who aren't quite sure how to handle new digital trends are replaced by millennials. But for now, digital identities are taking root and becoming akin to a credit score: you need one to be allowed to do certain things, and it should be managed wisely.