Wireless Charging and the Future of Batteries

April 15th 2015 in Blog, Technology

As devices become smaller and more powerful, one of the things taking a hit is battery life. Apple, for example, has enjoyed success and invited scrutiny with its latest iPhone models and its Watch, respectively. What we have is great, but it’s being pushed about as far as it can go as electronics become thirstier than ever. Even the new MacBook had to sacrifice space for computing power to fit what ended up being a multi-compartmentalized battery, rendering it even less powerful than a MacBook Air. The quest for thinner is now at an impasse: choosing between sleek looks and computing muscle.

Some of the most interesting battery innovation right now isn’t coming from trying to squish something into a smartwatch. It’s in the electric car industry. Tesla, the company that made electric cars both practical and cool, is working to bring an enhanced model of its car batteries into homes, and will probably announce them at the end of the month. Combining these batteries with solar power will be a boon to anyone trying to cut costs on their electric bill. That’s pretty cool, but how does that help your smartphone battery?

Let’s take a step back for a second and ask ourselves: what if we applied the same thinking to power sources that we do to other areas of computing? Take cloud technology, for instance. At this point, it’s ubiquitous. Wireless networks are omnipresent and it’s rarely inconvenient to rely on them. Computers are increasingly just windows to the troves of information that we store elsewhere.


This one stores love.

 

Now let’s take these building-powering batteries a step further with wireless charging. No more dangerous sockets for kids to zap themselves with. Whether you’re at the office or a coffee shop, your seating options won’t be limited by a dying laptop battery. You’ll even gain extra freedom to arrange your furniture wherever you want it. Oh, and guess what? Wireless charging is totally real. Starbucks has been testing wireless charging mats in its stores, IKEA is working it into its products, and numerous devices such as the Vessyl come with wireless pads that charge them when they’re placed on top.

So, yeah. Wireless charging! How does this affect your device’s battery life? The most likely model seems similar to how your smartphone’s internet access works. To get faster speeds that don’t munch into your data plan (and to save precious battery life), you probably use a wi-fi connection whenever one is available. Wireless charging would work the same way. As long as your devices are in range of a battery that is broadcasting power, they will be connected and charging. Once they are out of range, they revert to the onboard battery. And wireless charging is safe; it’s no more dangerous than a cell phone.
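To make the wi-fi analogy concrete, here’s a tiny sketch of the fallback behavior described above. Everything here is illustrative (the function name and the idea of "transmitters in range" are my own shorthand, not any real charging API): prefer broadcast power when it’s available, otherwise fall back to the onboard battery.

```python
def power_source(transmitters_in_range):
    """Pick a power source the way a phone picks wi-fi over cellular data:
    use the broadcast source when one is in range, else the onboard battery."""
    if transmitters_in_range:
        return "wireless charging"
    return "onboard battery"

print(power_source(["coffee-shop charging mat"]))  # wireless charging
print(power_source([]))                            # onboard battery
```

The point is that, like wi-fi, the switch would be automatic and invisible to the user.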

Wireless charging is likely to take off with electric cars first. Tesla knows that plugging in a car is a monotonous task that will become easier and easier to forget. Think of how many times you forgot to charge your phone before bed. Now imagine that you ride your phone to work. Elon Musk, Tesla’s CEO (and 007 villain in the making), is experimenting with robotic snake plugs that attach themselves to your car, but you can bet they’ll go wireless as soon as they can.

Amazon Dash: Why Simple Isn’t Always Efficient

April 8th 2015 in Blog

This will be an unusual article for me to write. I’m a pretty excitable guy when it comes to new technology and such. I like interesting stuff and seeing all the new and efficient ways of doing things that people are coming up with. Thus, we present the Amazon Dash: a new concept so simple it becomes convoluted again and essentially negates itself. Amazon Dash is a brand-specific plastic button. You press the button and a preset quantity of that brand’s product appears on your doorstep within a day (it’s by invitation only for Amazon Prime users). It uses a very low-powered wi-fi chip and is small, if loudly branded. Users receive a notification on their phone when the button is pressed and have a 30-minute window to cancel the order. As many of the products offered target people who have young children or pets, the button has a safeguard that allows only one order to be processed at a time (so pressing it fifty times in a row will only order your preset amount once).
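The two safeguards described above are easy to sketch in code. This is purely illustrative and not Amazon’s actual implementation (the class, method names, and time handling are all my own invention): repeated presses are ignored while an order is pending, and each order carries a 30-minute cancellation window.

```python
CANCEL_WINDOW_MINUTES = 30

class DashButton:
    """Toy model of a Dash button: one product, one preset quantity."""

    def __init__(self, product, quantity):
        self.product = product
        self.quantity = quantity
        self.pending_order = None  # only one order in flight at a time

    def press(self, now_minutes):
        """Place an order, unless one is already pending (press is ignored)."""
        if self.pending_order is not None:
            return None
        self.pending_order = {
            "product": self.product,
            "quantity": self.quantity,
            "cancel_until": now_minutes + CANCEL_WINDOW_MINUTES,
        }
        return self.pending_order

    def cancel(self, now_minutes):
        """Cancel the pending order if still within the 30-minute window."""
        if self.pending_order and now_minutes <= self.pending_order["cancel_until"]:
            self.pending_order = None
            return True
        return False

button = DashButton("trash bags", 1)
button.press(now_minutes=0)        # places one order
for _ in range(50):
    button.press(now_minutes=1)    # all fifty presses ignored
```

Fifty presses, one order: the debounce is what keeps a toddler (or a cat) from burying you in trash bags.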

It’s actually so weird and hilarious that everyone understandably thought it was an early April Fools’ Day gag product (it was announced on March 31st). But it’s not. It’s a real product made by the company that perfected large-scale commerce and also wants to bring drones into play as a delivery system. Dash actually started its odd life cycle as a handheld wand-like device that let you scan products as you ran out of them. It featured voice recognition as well, presumably so you could finally fulfill your dream of ordering Kraft products in the most magical way possible.


“Must…replenish…Velveeta!”

The idea behind Dash is seemingly simple. But why go to all the trouble of making zillions of little physical buttons if you can just make an app that does the same thing? This is where it starts to get needlessly convoluted for consumers. Right now there are about 270 products available for use with Dash (most of them are trash bags, diapers, and granola bars). Do you really want dozens of little buttons scattered around your house for all the things you want replenished on a regular basis? Or do you want to have to decide which three items (the current button limit) are worth having buttons for? That’s a lot of overthinking for something so supposedly simple. Dash is likely not the final form of whatever product is at the end of the tunnel. One of the newest categories that the Internet of Things has created is “smart appliances.” Refrigerators that order food when running low, coffee makers that order more filters or beans, washing machines that order more detergent…that sort of thing. That is a bit less convoluted than an individual button for everything, but the bigger question looming behind all of this seems to be: is going to the store really that big of an inconvenience?

I don’t think the Dash necessarily represents a bold new step in the temple of laziness that many bemoan the modern world has become, but the concept is something to ponder. Big brands are going to love the tighter bond between product and consumer that Dash buttons or smart appliances create. Instead of being confronted by fifty choices when you go shopping and weighing your options every now and again, the order-it-for-you technology encourages you to make a choice once and theoretically never again. Digital music and movies and even books are one thing; the physical presentation of those products is arguably less important as long as the delivery of that content doesn’t compromise the integrity of the audience experience. At the risk of sounding a little “get off my damn lawn,” do we really want to live in a world where you can theoretically go through life without directly interacting with other people in person, much less leaving your house? Some even argue that allergies were uncommon before the 20th century because people didn’t live in sealed, airtight houses. As bummerific as it may be to consider, for many people going to the store may be the only time they get out and about in a non-work environment.


But we’ll all go to space eventually and then everything will be okay!

The Dash isn’t a dumb product. It’s intuitive enough; I could foresee a future where a button with a logo on it is naturally assumed to be a “get more of this” device. Is it efficient? For the end user, it can be. But in the long run, Dash drives against the current zeitgeist in the acquiring of foodstuffs. The Dash in its current state seems to be a service people in the late ’90s would have dreamed of using the internet for. In a new millennium that prides itself on knowing where its food comes from, with farmers’ markets on the rise? Not so much.

Periscope: The Most Important App Release In Years

April 1st 2015 in Blog

Twitter’s newest app, Periscope, is one of the most important apps that has ever come out. If you have a smartphone and you haven’t tried it out, you need to get yourself to the App Store and tap the download button. Right now. There’s also Meerkat, but everyone seems to be on Periscope. They aren’t paying me to write this. I am genuinely very, very excited for the new territory being opened by these apps.

So what do they do and why are they so exciting? These are apps that stream live video from a smartphone. Connecting to Twitter lets you send out a link to your live broadcasts. If you have a smartphone, you have your own little TV network. I popped open Periscope while I was walking around town the other day and in less than a minute I had 40 people from all over the world watching. (I currently live in an Idaho town called Moscow, so most of them were confused Russians.)

What’s so great about it then? For average everyday use, it’ll just be a fun diversion, because most everyday tasks are boring. If you’ve ever wanted to watch people eating their food instead of Instagramming it, and you have gross friends, then you’ll be in luck. But creativity will bring new ways of using our smartphones to connect with the world around us. One of my writer friends texted me the day it came out: “I’m trying to figure out how to use Periscope for storytelling.”

One of the first ways we’ll see Periscope used for storytelling will be in journalism. Periscope’s first day in the wild saw it being used to cover an incident in New York City that ended with three buildings collapsing and 25 injuries. Hundreds of people were watching across several feeds from Periscopers in the area.


The first news story covered on Periscope.

When people started using weblogs (colloquially shortened to “blog”), they were usually regarded as a form of recreational writing. But in 2004, bloggers gained respect – and fear – from the press when they exposed CBS news anchor Dan Rather’s use of falsified documents relating to then-president George W. Bush’s previous service in the military.

Last year, most of the honest reporting from places like Ferguson wasn’t coming from CNN or Fox News; it was coming from journalists who were live-tweeting from the streets. Imagine a world where everyone can broadcast their own live, unfiltered views of events as they unfold. Periscope isn’t for journalists. It’s for everyone. It turns any smartphone user into a journalist. People like you and me. Anyone who wants the world to see things from their point of view. That world is here now and we will soon see it aid us in bringing greater transparency to many pressing issues.

The Charitable Side of Technology

March 25th 2015 in Blog

One of my favorite authors in high school was Michael Crichton (as it was for many people, Jurassic Park was my gateway drug). I’ve always appreciated Crichton’s stories because he never swayed to one extreme or the other in exploring the implications of cutting-edge ideas. As a scientist himself, he understood and respected the importance of innovation, discovery, and learning. But he also knew that the entire field is Pandora’s Box. Once something has been unleashed on the world, you can never take it back. That’s why criticizing a work like Jurassic Park as being impossible misses the entire point of that work. The story isn’t actually about cloning dinosaurs. The cloned dinosaurs in the story are a metaphor for the potential consequences of unchecked advancement and discovery fueled by ego in place of goodwill. But rather than condemn advancement and discovery, he understood that it was like any other force or object. Depending on who wields it and why, it can be a tool or a weapon.

"Genetic power is the most awesome force the planet's ever seen, but you wield it like a kid that's found his dad's gun." - Jurassic Park, written by Michael Crichton and David Koepp

“Genetic power is the most awesome force the planet’s ever seen, but you wield it like a kid that’s found his dad’s gun.” – Jurassic Park (1993). Written by Michael Crichton and David Koepp, directed by Steven Spielberg

Possibly technology’s greatest asset is its capacity to be charitable. And currently, many notable advancements being made in the tech world are those that spring from a charitable mindset. Once the “wow” factor and cost go down, there are many ways that new inventions can improve the quality of life for millions, and especially for the less fortunate. That’s why we now live in a time when ailments such as blindness and deafness can often be cured through the use of relatively inexpensive implants. The guys who cashed in on the early days of computing and internet advancement – Elon Musk, Richard Branson, Jeff Bezos, and Bill Gates are a few prominent examples – have turned their attention towards cultivating advancements that aren’t going to yield immediate results but instead lay the groundwork for future generations.

Bill Gates was recently asked in an online Q&A about the use of technology to extend lifespans. He pointed out that it would be selfish for people like him to research in that area while diseases that have been wiped out in most first-world countries still ravage third-world ones. And Gates has been willing to put his money where his mouth is; since leaving his leadership positions at Microsoft, he has put the same effort into helping people in developing nations that he once put into advancing home computing. One of the most important projects he is funding is new sanitation solutions that not only keep homes and streets clean, but also make efficient use of waste for energy and resource recycling.

Which is a nice way of saying “it makes water for you to drink from poop.” Check it out, though; it’s actually fantastic.

Crowdfunding has shown that a large group of people from all over the world can each provide a small piece to create something huge. The e-NABLE group is a community originated by Jon Schull, a professor at the Rochester Institute of Technology, that connects people throughout the world who need prosthetic limbs with people who can use their 3D printers to help design and build low-cost alternatives. Children especially are often deprived of a substitute because not only are full-blown robotic limbs prohibitively expensive, they will be outgrown multiple times before the child reaches the end of their growth period. e-NABLE’s alternatives run around $50 for a hand and $150 for a limb and can be customized and replaced part by part for almost no money as the child continues to grow.

Don’t be afraid of technology. Instead, be wary of those who use it exclusively for personal gain and support those who use it to improve the lives of others.

 

Buying a Concept: The Rise of Crowdfunding

March 18th 2015 in Blog

The thing that makes the Internet great is that it breaks down the gatekeepers in a variety of industries. In fact, it doesn’t even acknowledge they exist anymore; they’ve been circumvented and usually don’t realize it, or admit it, until it’s too late. Making a product for commercial mass-production is a huge challenge. Unlike many other endeavors, like writing a business plan or a novel, there’s overhead inherent in creating a prototype even to pitch. Generally speaking, the people who are going to be investing big bucks are going to be older. They often may not have the interest in, or knowledge of, something too unusual or tech-oriented.

Enter crowdfunding. Crowdfunding has always been a great way to get a decent amount of funding with little commitment from backers. A great example is filmmaker Darren Aronofsky soliciting donations of $100 each from friends and family to raise the $60,000 he needed for his first movie, Pi. The film made over $3 million, and his funders each received a promised return of $150, plus a credit on the film. He is now considered one of the best contemporary auteurs of cinema, and his last two movies, Black Swan and Noah, made over $300 million each.


There’s an idea; let’s crowdfund one of these suckers.

Of course, not everyone has people hanging out nearby who can fork over $100 just like that. And even that project took five years to fully fund. What’s great about crowdfunding, as it becomes a more streamlined process, is that you are essentially pre-ordering a product. You’re also putting quite a bit of trust in someone you’ve never met being able to deliver.

There’s also the added benefit of accountability. While Kickstarter does its best to screen projects, it makes it clear that you are responsible for what you fund with your own money. Even with the legitimacy of a service like this, there have been a few high-profile snafus involving projects failing to come to fruition or, even worse, bailing on their promises altogether.

Most Kickstarters that run past deadlines aren’t necessarily doing it because they’re lazy or incompetent. For many people seeking funding, this is whole new territory. Even when you have the funding, getting a physical product into production is a vulnerable process where any number of things can go wrong. I have a friend who likes to back Kickstarter projects such as games and collectibles; he’ll usually pad a project’s estimated delivery date by an extra month or two in his own expectations. In his experience, the more money a project gets, the more likely the delays. When a project asks for $100,000 and instead receives $8 million, that’s a whole lotta extra product to plan for. Delays are expected at that point. In this case, the responsible crowdfunding entrepreneur should provide regular updates on both the good and the bad. This kind of honesty in the manufacturing process will generate understanding and lenience from the backers.

Once the backer accepts that reasonable delays are a part of the logistical minefield that is crowdfunding, the patience can be rewarding. And for the most part, the transparency provided by a public platform is enough to dissuade anyone from using the services for dishonest personal gain. People know when they’re being lied to, especially when you have their money. And if not enough people back a project to meet the goals that have been set then no money is taken and the pledges are cancelled.

Crowdfunding is definitely here to stay. Looking at Kickstarter’s official numbers is pretty interesting. Since 2009 the service has helped raise $1.6 billion in funding. $1.36 billion of that went to successfully funded projects, even though only 38.83% of projects reach their goal. The majority of the successful dollars are coming from projects that are actually worth taking a look at. But for every successful Pebble/Exploding Kittens/Star Citizen, there’s a whole lotta crap like this, thus skewing the success percentage a little lower. Like other social outlets for creativity, such as YouTube, the cream rises to the top and 99% of the rest is total garbage.
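The gap between those two statistics is worth a quick back-of-the-envelope check. Using the figures quoted above (and nothing else), the dollars tell a very different story than the project count:

```python
# Kickstarter figures as quoted above (since 2009).
total_pledged = 1.60e9       # $1.6 billion pledged overall
successful_pledged = 1.36e9  # $1.36 billion went to successful projects

dollar_share = successful_pledged / total_pledged
print(f"{dollar_share:.0%} of pledged dollars went to successful projects")
# 85% of the money, even though only ~39% of projects succeed
```

In other words, the money concentrates heavily in the projects worth backing, which is exactly the cream-rises-to-the-top effect.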


And then there’s the tasty, tasty 1%.

The idea of buying a concept is starting to show up in other kinds of industry. Electric smart-car manufacturers such as Tesla will be able to send software updates to cars, similar to your iPhone. Camera manufacturer Blackmagic Design has a new model of 4K camera called the URSA that allows the user to swap out the image sensor when a new one becomes available, allowing for better image quality without having to buy a whole new camera in a few years.

As long as the temptation to release half-finished products is avoided, this business model will prove to be extremely fruitful. By releasing a product that is high-quality and upgradable, the customer has less stress over having to buy something over and over again whenever updates and changes are made. The relationship with the company is also personalized, as there is now a direct line to the consumer, and smaller companies are getting a chance to jump in and prove their mettle.

USB-C And The Challenge of Change

March 11th 2015 in Blog

The first patent for an electrical clothes iron was filed by Henry Seely of New York City in 1881. Besides the numerous safety concerns that needed to be solved before bringing it to market, there was also the issue of even being able to use the device in homes. At the time most electrical use in homes was for light sources and some of the first electric appliances were powered by connecting them to lightbulb sockets.

Seely’s device wasn’t too far ahead of its time. Patents for more familiar plug and socket systems started appearing in Europe in 1883. But it wasn’t until the mid-1920s that an electric steam iron became commercially available as a consumer product. In fact, none of Seely’s original devices exist to this day. His invention is often considered a great example of something just too far ahead of its time for its own good. But in tech innovation, being ahead of the times is what gets things going.

Upon debut in 1998, the iMac was the first consumer computer to ship sans a 3.5 inch floppy disk drive. In addition to this deliberately bold absence, the iMac also made use of USB ports in place of classic “legacy ports” such as PS/2, VGA, and serial.


These bastards.

Here’s how everything in the world works:

If you want something to become better, you have to take the steps to get there. If you want to run a marathon, you have to train and be disciplined and manage your dietary intake and do all kinds of things that make your body hurt while it becomes stronger. If you’re trying to get a hotel in a game of Monopoly, then you know that you have to buy four green houses before getting to that sweet, sweet red (unless you play like a barbarian with bank loans and Free Parking money).

Like anything worth doing, you can’t just whip up the best computer in the world and be done with it. Getting there is a long and gradual process of trial, failure, and occasional bouts of innovation. As long as computers are a main hub for productivity, learning, and general everyday tasks, we will never reach a specific end goal of Best Computer Ever. It’s a potentially infinite road that is paved by some of the smartest people on the planet.

Critics, and the public at large, are skeptical of sudden change in computing standards. That’s understandable, because it usually requires buying more things. Things that aren’t usually cheap. But at the same time, everyone wants their computers and electronics to be faster and more efficient.

Consumer products are made by large businesses that need to make money to survive. As long as people buy what they sell, they can keep doing what they do. The most idiotic complaint directed at the tech world is “why would they update that model? I just bought one last year! I’m so mad!” People assume they got ripped off when a product is updated too soon after they buy one.

But any company worth its salt won’t let a product line stagnate and wither. Smart, successful companies are always working to improve the latest versions of their goods. That means tinkering with it until it’s ready for a big update. Keep in mind that in the computing world, a lot of things get built from scratch. Extensive testing and returns to the drawing board are not uncommon. This can take a while, and it makes sense to push the current product as far as it can go in the meantime. And when it comes time to replace yours, you will find a newly enriched product; often with a similar price tag, often with more powerful guts.

This week, Apple caused an interesting ruckus with the reintroduction of the MacBook, minus the “Air” and “Pro” descriptors. It’s a pretty piece of machinery that’s stupid thin and boasts Apple’s trademark minimalism. But is it too minimalist? There are only two ports on the device. A standard 3.5mm headphone jack and a multi-purpose USB-C port.

As more manufacturers include USB-C in their products, it’s poised to become an industry standard. Unlike Apple’s other cables such as Lightning and Thunderbolt (developed with Intel and used by other manufacturers as well), USB-C is a true catch-all cable. As you’ve probably guessed from the lack of other ports on the new MacBook, USB-C is a power cable as well. It’s also a high-speed data transfer and AV I/O cable. And it’s very thin – almost as thin as the Lightning charging cable.

Whether you like Apple or not, you have to admit they are respected as tastemakers of the tech world. The iMac’s bold design choices set a personal computing standard that has evolved across many companies’ products to this day. iTunes stared down physical media in entertainment while cracking its knuckles and muttering, “I’m coming for you, man.” Smartphones were around before the iPhone, but no one could get one right until Apple banished the physical keyboards that were holding the ideas back.

The pattern is consistent. Apple takes a good idea, strips away any of the unnecessary crap, and makes it more efficient. By innovating with the purpose of removing unnecessary junk, Apple has consistently changed the way we transfer data from our various devices. And with wireless charging soon poised to make a debut in personal computers, one data port is all you’ll need.

Sometimes you have to be crazy enough to release a product a little before its time. And for now, the other Macs still have their ports; Apple is playing it smart by not tinkering with any of their current product line, but instead resurrecting a dead one to experiment with.

And if you don’t like it, don’t buy it.

Apples and Pebbles: Comparing Smartwatches

March 4th 2015 in Blog, Technology

Kickstarter has produced some wacky projects that have really taken off. Everything from card games about exploding kittens to some dude who got a serious case of the munchies has been successfully funded. It’s become its own Shark Tank reality show in a way; for every good idea, there’s a ton of stuff that would make even the most jaded SkyMall customer blush.

In 2012, startup Pebble broke the Kickstarter monetary record for its original run of smartwatches with over $10 million in pledges from over 68,000 people. (Their target was $100,000.) Pebble became a hit and has now shipped over 1 million watches. Now Pebble is back on Kickstarter with an updated version of the watch called the Pebble Time. In just a week they have rocketed past their goal of $500,000 to over $15 million, once again making it the most-funded project on Kickstarter. I haven’t had a chance to play with one in person, but the Pebble does seem to be a pretty cool little gizmo, especially for the price point.
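Those figures are worth running through once, because the multiples are absurd. Using only the numbers quoted above (the average pledge is my own back-of-the-envelope estimate from them):

```python
# Original 2012 Pebble campaign, figures as quoted above.
original_goal, original_raised, backers = 100_000, 10_000_000, 68_000
# Pebble Time campaign after one week.
time_goal, time_raised = 500_000, 15_000_000

print(f"Original Pebble: {original_raised / original_goal:.0f}x its goal")
print(f"Average pledge: roughly ${original_raised / backers:.0f} per backer")
print(f"Pebble Time: {time_raised / time_goal:.0f}x its goal in one week")
```

That’s 100x the original goal, and the Pebble Time hit 30x its goal before the campaign was even over.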


And totally not Apple Watches.

The Pebble is definitely a grassroots hit; and with an almost too simple look, complete with e-ink display, it’s arguably more minimalistic and devoid of bells and whistles than the forthcoming Apple Watch from the company that most prides itself on sleekness. The Pebble Time is already being called the Apple Watch killer, despite the fact that neither device has been released yet. A huge deal for both is battery life. The Pebble Time is boasting a rather impressive 7 days between charges while the Apple Watch is going to have to be recharged every night, assuming it makes it through the entire day (because – let’s be honest here – people are going to try and watch YouTube videos on it and be disappointed when that doesn’t work out so well).

I’m willing to give both a chance. At this point, they both suffer from the biggest con of all – they’re both smartwatches. I’ve seen like two smartwatches in person, ever. At least one was just a cheap LG or something. And while the Pebble doesn’t give me the same skin-crawling “calculator watch” feel in terms of design that pretty much every other smartwatch does, it still looks like something you might get in a Happy Meal. The Apple Watch, on the other end of the design spectrum, looks nice but still feels a little too flashy. Apple is known for making products that declare themselves to the room in their smug “too cool for school” Jony Ive way, but until now you haven’t strapped one to your wrist. It would almost be worth waiting a year to get one just so that every time I try to make a point in a conversation my watch doesn’t become the topic. (I’m not Italian, but when I’m deep into an animated talk, my hands definitely are.)


“Hi! My watch is better than you.”

People like everything divided up into neat categories of black and white, right or wrong. In the realm of mindless fun, like sports, it makes sense. X sports team is obviously better than Y sports team because they’re my favorite! When you let this mindset loose into actual real life, (like say, making politics exclusively Democrat vs. Republican) it frankly makes you look like an idiot. I’m a fan of Apple products, so when a dyed-in-the-wool Windows or Linux user tries to bait me into a fight about “which team is better,” my response is kind of a shrug of “everyone needs a different product for different things.” This was especially confusing to people who would try and pick computer fights when I worked at an Apple store.

People are obsessed with the idea of “better,” and most of the time it comes from an insecurity that they’ve made the wrong choice somehow. They need validation that where they’ve put their money is indeed The Best. But here’s the thing. It doesn’t actually matter. If you like it, and it does what you need it to do, then for you it is The Best. Who gives a damn what everyone else thinks? I have better things to do than argue with someone that Apple makes better computers than everyone else. For me and my needs, that is true. For others, that may not be. Same goes for Android vs. iPhone, Pebble vs. Apple Watch, and so on.

Pebble is probably not going to dip into the Apple Watch sales any more than the Apple Watch will dip into the Pebble’s sales. As far as I can tell, they’re angling for different kinds of consumers. And that’s great. A healthy marketplace will always have room for rivals. Not only does it give consumers more options, but it keeps each company on their toes as far as constantly innovating. Comparing them at a certain point is all about personal preferences and it gets pretty nitpicky. Don’t get too worked up about what the other guy is using. Just choose what works for you and have fun with it.

AI Part II: The First Threat

February 25th 2015 in Blog, Technology

Every generation has that guy standing on a street corner with a big ratty DOOMSDAY sign. Sometimes, like in the recent Cold War era, everyone else joins him. For a while, terrorism was Fear’s employee of the month, but that hot streak has cooled by now. It’s still around, but its desk got stuck in the basement with Milton from Office Space. There are a lot of fringe-sounding ideas getting talked about in mainstream media these days, since big tech moguls (who tend to be a pretty eccentric bunch) are getting a lot of face time and becoming household names. One of those fringe ideas is artificial intelligence.

A few well-known moguls have expressed rational levels of concern without being too much of a buzzkill about it: Bill Gates and Elon Musk are both interested in AI development, but only if it is done with great thought. Stephen Hawking is less thrilled by the idea and has said that AI could spell the end of the human race.

AI’s most tangible threat is currently unfolding in low-level jobs where it is already being implemented to replace human employees.

While jobs that require a very specific and trained set of skills are often aided by AI, they still employ highly trained individuals. Less glamorous jobs, such as those in factories and manufacturing plants, have already started to see humans replaced by automated robots. More and more, any job that doesn’t allow for thinking on the fly or outside the box is going to be targeted; the work being replaced is usually repetitive and robotic by nature. The average, everyday job seems to be safe – for now. According to Bill Gates, as artificial intelligence develops, it will start to edge out middle-class employees as well.

But after that warning, Gates hypothesizes a refreshingly optimistic solution. He says that while concerns arising from the replacement of people by machines are valid, this is a chance to improve education and help people survive displacement by forthcoming robotic workers. After the recent financial crisis, economic downturn and job loss are a more relatable and worrisome threat to the current working generations than the vague, always-impending physical destruction of the world that was prevalent in the Cold War (and which is usually the bread and butter of AI doomsayers). If AI does pose any threat in the immediate future, it is in socioeconomic areas such as this.

Foxconn is a massive Chinese electronics manufacturing corporation that is notorious for the slavish treatment of its 1.2 million workers. Conditions are harsh and hours are inhumanely long, forcing many employees to live in crowded factory dorms just so they have time to sleep, never mind affording a place of their own. In 2010, 14 workers committed suicide by jumping off of factory rooftops. Foxconn’s response was the installation of suicide nets. Many of the companies using Foxconn’s services have turned up the heat over the treatment of workers, including Apple (which is currently bringing some manufacturing back to the United States – it opened a facility in Texas in 2013 and another is planned in Arizona).

In 2012, Foxconn invested in 10,000 robotic arms to replace workers at $25,000 a pop, with 20,000 more on the way. The plan has already started to backfire: the robotic arms are not precise enough for many of Foxconn’s products, forcing the company to hire back as many as 100,000 workers.
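The scale of that bet is easy to put in numbers. A quick back-of-envelope sketch (the $25,000 per-arm price and the 10,000/20,000 unit counts are the figures reported above; the rest is just arithmetic):

```python
# Back-of-envelope math on Foxconn's automation outlay.
arm_cost = 25_000        # USD per robotic arm (figure from the reporting above)
arms_installed = 10_000  # initial purchase
arms_planned = 20_000    # additional arms "on the way"

initial_outlay = arm_cost * arms_installed
total_outlay = arm_cost * (arms_installed + arms_planned)

print(f"Initial outlay: ${initial_outlay:,}")  # $250,000,000
print(f"Planned total:  ${total_outlay:,}")    # $750,000,000
```

A quarter of a billion dollars on the first batch alone, which makes re-hiring 100,000 workers on top of it an expensive course correction.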

Disposable treatment like this says a lot about the business environments that are driving a demand for workers to be little more than robots. Companies like Foxconn are just as ruthlessly efficient and demeaning to humans as any potential human-hating AI singularity would be. Bill Gates is right. If AI software is advancing fast enough to shuffle people around like a pile of computers, then the best way to head off AI-related trouble in the job market is to create better opportunities, training, and education. AI is just a complex tool. For the foreseeable future, the most dangerous thing will be the people wielding it. And if those people continue to treat other people this way, any future AI they create will be just the same.



AI Part I: Introduction to the Uncanny Valley

February 18th 2015 in Blog

No conversation about the future is complete without a discussion of artificial intelligence (AI). There are quite a few misconceptions about what AI actually is and everyone has a slightly different way of approaching it. Usually opinions are at one extreme or the other; either gloatingly dismissive or obsessively paranoid. The rational response probably lies somewhere in between.

Here’s a more rational approach.

AI is not a living thing and it never will be. It’s exactly what the name implies: a simulation of intelligence that can give the illusion of being a sentient being. No matter what, anything and everything it does is a result of complex programming. But when it gives the illusion of being a living thing, and does it convincingly, what kinds of effects will that have on people?

There’s a concept in robotics (and it applies to other simulations as well, such as computer-generated imagery – CGI – in films) called the “uncanny valley.” The idea, put forward by roboticist Masahiro Mori in 1970, posits that our brains can recognize when something is “off” in a fellow human. If someone is lying to you, you may believe them, but if they have a subtle “tell,” your brain can pick it up and process it without you fully realizing it. You are used to truthfulness in your interactions with others, so even the subtlest lie can give you a sensation of unease.

As simulations both virtual (CG) and physical (robotics) advance and become more realistic, our brains happily accept what we’re seeing, as long as it’s obviously fake. But when, say, a CG human in a movie is 99.9% realistic, the upward curve of acceptance collapses into that uncanny valley. The uncanny valley is the point where every tiny little thing that’s “off” jumps out at us. Simply put, a simulation of life, like a robot, will keep becoming more realistic and relatable until it is so close to being a hundred percent indistinguishable from the real thing that the tiniest flaws leap out at you, often causing discomfort. You can watch a CG animation like a Pixar movie, with its traditionally exaggerated, cartoonish humans, and not feel creeped out. Your brain doesn’t pick up the flaws of the blue cat people in Avatar because you know that blue cat people aren’t real and your brain isn’t wired to recognize them. But when you watch something attempting photorealism, your alarms go off. You may not be able to say why (although it’s usually described as soulless dead eyes).
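Mori’s idea is usually drawn as a curve: our affinity for a simulated human climbs with realism, plunges just short of full realism, then recovers. A toy sketch of that shape (the thresholds and slopes here are illustrative inventions, not numbers from Mori’s paper):

```python
def affinity(likeness):
    """Toy model of Mori's uncanny valley curve.

    `likeness` runs from 0.0 (clearly a machine) to 1.0
    (indistinguishable from a human). Affinity climbs with
    realism, dips sharply into revulsion just short of full
    realism, then recovers. Shape and numbers are illustrative.
    """
    if likeness < 0.7:
        # Cartoonish territory: more realism, more appeal.
        return likeness / 0.7
    elif likeness < 0.95:
        # The valley: almost-human reads as "off" and appeal collapses.
        return 1.0 - 8.0 * (likeness - 0.7)
    else:
        # Truly indistinguishable: appeal recovers.
        return -1.0 + 2.0 * (likeness - 0.95) / 0.05

for x in (0.3, 0.8, 0.99):
    print(f"likeness={x:.2f} -> affinity={affinity(x):.2f}")
# likeness=0.30 -> affinity=0.43
# likeness=0.80 -> affinity=0.20
# likeness=0.99 -> affinity=0.60
```

Note that the 99%-realistic case scores worse than the frankly cartoonish one, which is exactly the “dead eyes” effect described above.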



Let’s say that AI does become self-aware and reaches the theoretical point called the singularity (more on that in the next part). A machine becomes self-aware; it knows it’s a machine, but that doesn’t change anything as far as its own existential problems go. It’s still just a machine. If it looks like a trash can, it’ll be easy to accept. But if it looks like a regular person, you might have trouble rationalizing to yourself that this is just a walking iPod. Animators know this trick; that’s why everything from Disney princesses to Avatar cat people to WALL-E has massive eyes. It instantly makes the character empathetic (and sells tons of toys). Even cars have “faces.” It’s why, in PG action films, the bad guys are usually masked. It’s easier to accept that the heroes are killing hundreds of other humans if we can’t see the faces they’re snuffing the life from.

I’ve talked to people who don’t care about any of this at all. An AI-powered robot is just a glorified computer; therefore, any concerns are moot and why should we care? I agree with the premise, but I disagree that there shouldn’t be any concern at all. Any unchecked advancement in a field of science will lead to trouble.

Innovation And The New Space Age

February 11th 2015 in Blog

They always say that hindsight is 20/20. This usually seems to apply to the speculation that is traditionally presented in the science fiction genre. To some degree, at least. There’s so much being explored in the realm of futuristic fiction that it’s difficult to see what’s going to stick until it actually happens. Jules Verne’s submarine was far-fetched at the time, but a few decades later, U-boats were crucial aspects of modern warfare. At the same time, other topics in science fiction – self-aware artificially intelligent machines in the work of Isaac Asimov and space colonization in that of Arthur C. Clarke for instance – have remained in a firmer speculative category.

Until now. For a good portion of the 20th century, this kind of space-age advancement was dependent on government programs. Indeed, much of the innovation we’ve seen came about as a result of arms-racing, or the PR arm-wrestle that was the Space Race. Once there was no need for America to beat the USSR to the Moon, space travel became less of a government interest, leading NASA into a state of decay and relegating astronauts to low-Earth-orbit research missions. The following statement is both fascinating and frustrating: we went to the Moon and came back safely before we had the internet and cell phones. And we haven’t been back since.

What happened?

Innovative events like the Moon landing came from a need to reassure the people of America – at a time when political propaganda was widely accepted and less contested – that we were the best and could beat anyone. Once we did, there was no reason to keep going. The goal had been set and met. It’s the same thing that happens when someone starts a company with the attitude of “I’ll sell it in two years for a gazillion dollars.” Those companies generally don’t fare as well as the ones started with the attitude of “let’s make a positive impact and, yeah, we’ll be sure to make some money on the way, too.”

Great innovation and true ground-breaking discovery come from a maverick attitude that is quite prevalent in the startup world. It’s an industry home to a few self-made billionaires (the kind who drop out of Ivy League schools to start Facebook or Microsoft) who generally have an anti-authoritarian streak (remember, it takes a special kind of insanity to be this different) but often a compassionate side for humanity as a whole.

Technological leaps aren’t as easy as banging out a few prototypes and Kickstarting the rest. Thomas Edison famously worked through some 10,000 attempts before landing on a viable lightbulb. Your iPhone is a very affordable luxury, but it took millions of dollars to reach your pocket. And unless you are a big company that can dump tons of cash into what will likely end up being skunkworks, there’s not a lot of immediate incentive to push past the status quo. This is where renegade billionaires come into play. Self-made entrepreneurs like Richard Branson, Elon Musk, or Bill Gates don’t necessarily need to see an immediate return on every single thing (although it’s still good business to do so). Because they see beyond short-term advantages like immediate profits, they can give complex ideas the time necessary to come to full fruition. They have the resources and patience to nurse something until it can spread its own wings. As an example, Musk’s Tesla Motors reported its first profitable quarter in 2013 – ten years after opening up shop. Patience pays off, especially when you can afford it. On top of that, these entrepreneurs are interested in making the world a better place that outlasts them. Pushing for huge industry changes, like electric vehicles over gasoline, reflects that.

That’s why we are seeing a second renaissance in complicated and expensive industries like spaceflight. Right now, guys like Elon Musk are saying what NASA was saying when I was a kid: that we’ll have someone on Mars in the next 15 years or so. Except this time, I believe the person saying it. I have no doubt that humans will have landed on Mars by 2030. NASA has been limping along as an underfunded government arm for several decades now. With no uber-patriotic cause needed to unite America, the drive isn’t there at a government level. But SpaceX and Virgin Galactic aren’t spending taxpayers’ money, so they don’t need politicians’ permission for “useless” science playtime.

The kinds of innovations being made now aren’t feats that can be accomplished in a garage. While garage tinkering will always be around, you can’t get to space with spare car parts and popsicle sticks. Just like the Age of Discovery five centuries ago, the Age of Space will be funded by private corporations. And as those explorers realized, insisting on a round trip impedes progress with all the effort and time spent on returning. Like New World colonization, missions to far-off places like Mars will be a one-way ticket.
