Scattered Thoughts on Amazon’s Workplace Culture

The New York Times published an in-depth article on the business practices and corporate culture at Amazon this past Saturday. The piece, by Jodi Kantor and David Streitfeld, shows a company attempting to take over the (retail) world by any (legal) means necessary. From hiring practices to employee reviews, Amazon relies on principles that seem intuitive from the outside but sound absolutely destructive to those employed by the web giant.

The old mantra was that a company’s most valuable resource is its employees. Amazon digs deeper than the employee – it’s the employee’s labor that drives the machine. In a data-driven or data-dependent (depending on your view) society, efficiency is the golden goose, the star in the sky that guides its followers. The equation posits that more efficient employees bring in more revenue, which theoretically drives profits because operating and labor costs stay flat. Amazon has essentially reduced workplace culture to a basic equation and set about making the slope as steep as possible. If its employees are four times as efficient as a rival company’s, Amazon will make four times as much money at the same cost.
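That “equation” can be sketched as a toy linear model. To be clear, the numbers and the function below are mine, purely for illustration – this is not Amazon’s actual accounting, just the logic the paragraph describes: revenue scales with efficiency while costs stay flat.

```python
def annual_profit(efficiency_multiplier, base_revenue, flat_costs):
    """Toy model: revenue rises linearly with per-employee efficiency,
    while operating and labor costs do not move at all."""
    return efficiency_multiplier * base_revenue - flat_costs

# A company whose workforce is 4x as efficient as a rival's books
# 4x the revenue against the same cost base (illustrative units).
rival = annual_profit(1, 100, 80)    # 100 - 80 = 20
amazon = annual_profit(4, 100, 80)   # 400 - 80 = 320
```

Because costs are the fixed intercept, every extra unit of efficiency drops straight to the bottom line – which is exactly why the slope is the thing Amazon maximizes.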

But it’s not the same “cost”. Amazon gets people to buy in for a variety of reasons, but few (if any) of them are altruistic. Amazon isn’t really in the business of helping those who actually need help – its basic goal is to be The Everything Store (there’s even a book by that name!). The company wisely bet that customer service doesn’t really matter unless it’s absolutely terrible; people just want their stuff as cheaply and as quickly as possible. Amazon rarely gets beaten on price and now offers one-hour delivery in certain cities across the country. No other company of its size can compete with that.

So Amazon’s customers are happy, but its employees are miserable. The article is damning in its swaths of evidence – it could’ve been twice as long with testimonials of battered former employees finally seeing sunlight for the first time in ages. If labor is the fungible portion of the equation, its malleability ruins people’s lives. Employees are pushed and pushed to give more and more, eventually cracking under the pressure, the stress and the impossibility of it all. By then their value has been harvested, so they’re sent out to pasture so a newer, younger, better employee can be milked for his or her potential. Amazon is a factory farm of talent, and who wouldn’t want to work for the “largest (online) retailer in the world”? If you’re not working for yourself or for a better cause, Amazon might be the pinnacle of achievement. And that is so incredibly sad.

Amazon has prestige, so it can attract top talent. These people get indoctrinated into Amazon’s famous “culture”, which is probably akin to 21st-century Marxism. Once they buy in, it’s over. They want to be better than everyone else, tell people at parties they work for Amazon and tweet about how great the company is for all the cool “perks” it offers. It’s all bullshit – every major company has guiding principles and benefits and thinks it’s the best. No one should need to “buy in” to be accepted and valued and treated with respect.

How does Amazon find and evaluate its employees? Data. From the article:

“Data creates a lot of clarity around decision-making,” said Sean Boyle, who runs the finance division of Amazon Web Services and was permitted by the company to speak. “Data is incredibly liberating.”

Take the quote and its context first. One of the people in charge of making Amazon money by using the internet as a virtual lockbox likes that he has access to troves and troves of data. Figuring out how to sequence, manipulate and connect data in order to package a service that generates revenue is literally this man’s job. He’s not terrible at it – Amazon Web Services is likely one of the company’s most profitable divisions and, given the inevitable demand, will scale to an unforeseen level in time.

However, data analytics is at best a mixed bag and at worst a terrifying crutch. Boyle is correct in saying that data creates clarity around decision-making, but it can also eliminate the necessary human component. Incorporating data into a bigger picture can be extremely useful; relying mostly or solely on data all but eliminates any wiggle room or character considerations. Those who are the most efficient or best at their jobs may not be the best employees for Amazon’s future, but a data-dependent approach will gloss over the supposedly mediocre in favor of the outliers.

Proponents of Amazon’s approach will cite its one major advantage: parity. No one is judged by their appearance, their age, their gender or their political leanings – just their work. It’s utopian, Darwinian (an idea echoed by current and former employees in the piece). But everyone plays at some kind of disadvantage, whether it’s their home life, their manager, the work itself, an illness, a lack of motivation, et cetera. You can’t modify that component in the equation; there’s no 50% bonus for “working sick” that can be applied to an employee’s utility. The numbers are left to speak for themselves. Again, data should be a component, but by no means does it level the playing field. Instead, it strips away a company’s heart until it becomes nonexistent.

A common theme in the article was survival. It was as if the reporters found everyone they talked to at a self-help group for former Amazon employees, each sharing their horror stories as therapy. This is what Amazon leaves in its wake: burned-out people who thought they’d be joining an amazing company, only to find out that it looks a lot better from the outside. This article will cause some changes, but Amazon has even greater aspirations than being the biggest kid on the ever-growing block. To reach its lofty goals, further sacrifices will be made, and most of the people feeling those pinches likely won’t be around to reap the rewards.


Visual Media’s Infinite Loop

There’s a seismic shift underway in televised entertainment, one so big it’s impossible to miss. People are watching less and less TV, or at least TV administered through a cable box and displayed on a large HD screen. As appetites shift and convenience trumps content, people have begun looking at their cable subscriptions as ancillary. While internet service — which was finally declared a utility late last year — has become extremely important, its bundled cousin has fallen by the wayside.

The chart here, courtesy of the suddenly click-baiting Wall Street Journal, tells us what advertising and media executives already know – younger people aren’t watching “traditional” TV. They’re still watching content, but it’s rarely live and rarely on an actual television set. Television networks make their money by selling commercial time, with the most-watched shows generating the most advertising revenue because advertisers want their products to be seen by the most people. This is why a 30-second commercial in the most coveted slots, like the Super Bowl, sets a company back more than $3 million.

But if people aren’t watching TV, advertisers won’t pay to display their products on TV. And if that happens, networks will run out of money. So they’ve looked at ways of monetizing the content people are actually watching. This is why YouTube runs those ads before most videos and why podcasts are usually sponsored by at least one company.

This is also why sports are one of the last bastions of advertising dollars. People rarely watch sports on delay or even days later, because the result will likely be spoiled by the time they tune in. Live audiences are almost forced into watching commercials — change the channel and you may miss the key play of the game. Television networks know this, but so do professional sports teams, which own the regional broadcast rights to their games and have the power to sell them for whatever they can get. This is why Time Warner paid more than $7 billion (yes, billion with a B) two years ago to air 25 years of Los Angeles Dodgers games. That deal, arguably a disaster for Time Warner, guarantees that the Dodgers could spend $280 million annually on player salaries and still break even on the Time Warner money alone. To put that in perspective, the team itself was sold for “only” $2 billion in 2012.
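The payroll claim above is simple division; a quick back-of-envelope check, using only the figures in the paragraph ($7 billion total, 25 years), confirms it works out to the $280 million annual figure:

```python
# Back-of-envelope check on the Dodgers TV deal cited above.
# The $7 billion total and 25-year term come from the article;
# splitting the total evenly per year is my simplification
# (real media deals are usually back-loaded).
deal_total_dollars = 7_000_000_000
deal_years = 25
per_year = deal_total_dollars // deal_years
print(per_year)  # 280000000, i.e. $280 million per year
```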

Changing viewing habits threaten to eventually eliminate those types of deals, potentially bringing down the cable bundle altogether. Networks have begun preparing for this, releasing stand-alone apps that bring their content to tablets, phones and computers. Want to watch last night’s The Daily Show? There’s an app for that. Want to watch a Yankees vs. Red Sox game? There’s an app for that (provided you live outside both New York and Boston, and the game isn’t broadcast nationally, and you have a paid subscription, but still). Want to watch Vince Vaughn’s eyes slowly die onscreen in True Detective? There’s an app for that.

Right now, all of these apps have some kind of restriction, whether it’s based on money, location or time. Professional sports leagues require a large monetary investment, and even then games can be subject to blackouts. Free apps, such as Comedy Central’s, post shows the following day and still carry advertisements. For apps such as HBO NOW or Netflix that charge a monthly subscription, there’s the bang-for-your-buck fear. If there’s not enough content — in terms of quality, quantity or both — people will cancel their subscriptions. If AMC functioned like HBO and it suddenly lost The Walking Dead, would the company go under within a year?

Acquiring and creating this level of content costs a substantial amount of money, which means that those monthly fees need to be adequately high. Netflix has already openly talked about raising subscriber fees and others may feel the need to as well. While sports leagues like the current model a lot, league executives know that they’ll likely have to remove their trusted blackout policies. Opening such a door would lead to increased consumer revenue, but would piss off TV providers and network owners, some of which own teams themselves (hello, conflict of interest!).

The irony is that on a granular level, consumer tastes won’t really change that much. Yes, the platforms for content are rapidly shifting, but people who want to watch a lot of TV will end up subscribing to HBO, Netflix, maybe a couple of basic TV and basic cable apps (which will look a lot like CBS’s app) and maybe ESPN. Together, those apps might run upwards of $100 a month and would provide only the bare minimum acceptable to that individual. If they want more, they’ll likely wish an internet bundle existed.

Obviously, the shift in content and its platforms will be unpredictable, but advertisers will certainly find a way to connect with their target audiences. The leaders of TV were those that got there first (NBC, CBS, ABC, FOX) and those that built the most passionate audiences (ESPN, Fox News). It seems likely that if the internet were ever to look like television does today, those same types of leaders would be best positioned to rule. If we want bundled content, Google and Facebook will likely figure out a way to provide it for us. It’ll be the same story, just with different people pulling the strings and connecting the tubes.


Indiana’s RFRA and My Religious Upbringing

I was raised Catholic. This is as much an admission as it is a badge of pride, an emblematic ideal of a society that doesn’t actually exist. Both of my parents were raised Catholic as well, or at least they told me they were. I went to Catholic school from first through eighth grade, was an altar boy for four years and even worked as a substitute teacher in a Catholic school in 2013. Remember kneeling as a child on those padded benches? It’s so much worse as an adult. I now know why my teachers always sat down during those parts and quietly yelled at us when we complained.

Looking back, I’m extremely proud of and grateful for my education. I was blessed with exceptional educators (for the most part, at least) and a solid community foundation, and I made friends I still have today. The historical aspects of the Bible or religion class never captivated me, so I can’t say for certain that this particular educational experience was more beneficial than anyone else’s. However, I do know that the values instilled in me during those formative years were and are extremely important.

In Catholic school I learned how to be a better person, how to put others before me. I learned how to extend to everyone the simplest forms of respect and to expect nothing in return. My parents played a very important role — in both sending me to Catholic school and raising my siblings and me in a manner that would exemplify the lessons I learned in class — and for that I am extremely grateful.

I learned about Jesus, his life, his death and, most importantly, his beliefs. I learned about the Golden Rule, a motto I follow to this day, one that has been badly corrupted and manipulated recently. I was taught that we should love one another, that everyone is equal and that compassion and peace should trump greed and terror.

Basically, everything I learned in Catholic school is a lie.

My education does not reflect what popular Christianity represents today. They somehow seem to be polar opposites, even if their decrees are rooted in the same beliefs. I cannot fathom how people who preach the teachings of Christianity can be so openly bigoted, hateful and homophobic, because I was never taught to exhibit any of those behaviors. Maybe it was to shelter my classmates and me from the harsh truths of the “real world”, but no one ever taught me that Jesus hated gay people or that they belong in hell. Shit, you supposedly go to hell just for saying “Jesus Christ” outside of a licensed or sponsored prayer. If that’s the case, I’ll see everyone there!

Christianity taught me to be inclusionary, to lend a helping hand when I could, to be responsible for my actions because consequences were plentiful. My religion, my faith, basically taught me morality.

Religion hasn’t abandoned me so much as I have cast it aside. Church visits became infrequent as weekend activities and teenage sleeping habits took their place. It wasn’t truly necessary anymore; its importance faded like week-old cold medicine. The morals and the code of conduct stuck, though; I wasn’t suddenly going to turn into an insolent asshole because I stopped going to mass. Their importance hasn’t wavered in my life, but my need for affirmation and guidance predictably waned. I’m an adult now; I understand the difference between right and wrong.

Indiana’s RFRA law is wrong. It was written by homophobic individuals who wanted to hide their bigotry under the shroud of religion. Their ploy has worked: those who support the law feel that they are being persecuted and that their religious freedom has been ripped from them. We’re still fighting the same “oppression” the Puritans fled before America became the United States; the only difference is that these same people now believe the US is the best country in the world. Land of the free, but only if you’re Christian, white and straight. Oh, and also male; that’s the most important part.

Like a lot of individuals, I stopped caring about my faith as I became more educated. I learned what people had done in the name of their religion and felt shame, for something I thought was so peaceful was, in fact, so devastating. Faith and greed are the two main sources of evil on this planet because they remove rationality from the equation. Why is Christianity better than Hinduism? It’s not; they’re just different, and it’s completely acceptable to be different and have different viewpoints…as long as those viewpoints don’t directly harm anyone. RFRA harms people by making them feel that they’re wrong for being different. They’re still men and women, and they still want to give their patronage to those deemed worthy to receive it.

At its most basic, religion is a crutch that puts a barrier between death and finality. Christianity has heaven and heaven sounds fantastic. Everyone’s supposedly up somewhere in the sky just relaxing and eating pizza and watching Walter Payton catch passes from Johnny Unitas on a perfectly green field in a stadium that will fit anyone who wants to attend. Who wouldn’t want to believe that exists? As a society, we don’t do well with acceptance. We can’t accept something is our fault — it must be someone else’s. We can’t accept that we’re falling behind in math and science as a country — people from other countries want to come here. We can’t accept that there’s a good chance heaven doesn’t exist — but what is the meaning of life if not a test to weed out the bad people from the afterlife? Christianity — and in truth most of religion — is based on a faulty concept and preached to the masses because it sounds better than the alternative. But if your fundamental belief has no basis in reality, you’re choosing to be ignorant much more than someone is “choosing” to be gay.

It’s painful to accept that I live in a country that openly discriminates against women, ethnic minorities, homosexuals and transgender individuals. But I accept it because I’ve seen what religion can do. It can be used for good, to teach acceptance and reciprocity and kindness, or it can be used to instill shame and anger and cowardice. This law was made by cowards, for cowards who need to hide behind the bullshit that is religious persecution in order to legally be “weirded out” that someone they are prejudiced against might actually want to give them money for their goods or services. I was raised in a Catholic home and a Catholic school, and never have I felt further from the church. If this is what Christianity truly stands for, then I don’t want to be associated with it ever again.



Resolutions Past and Present

New Year’s resolutions carry the same weight as drunken promises. They’re meaningful in the moment, full of hope and excitement, but ring hollow shortly thereafter. For the same reasons that people like the number 100, New Year’s resolutions are common improvement tactics that require only the idea. They’re startups without the capital to actually do anything, percolating thoughts with almost limitless potential.

Making a New Year’s resolution obviously takes much less effort than following through. Once it’s in the ether, it exists, a tangible ball of kinetic energy barreling toward success or failure. Most reach failure, but there are those select few that somehow find a way to succeed. I’ve made many resolutions, carrying a success rate that resembles the Chicago Cubs’ World Series wins since 1900. Some were achieved, and because successes are so rare, they still hold weight years later.

In each of the past three years, I’ve written down my New Year’s resolutions in the hope that they’d be committed to memory and then somehow achieved, like resolution by osmosis. Two years ago, I wanted to write the types of things for which a person is remembered. I also wanted Scrooge McDuck’s pool of money. Neither of those things happened.

Last year, I carried over the same goals — no reason to stop dreaming now — but included more attainable and emotional “promises” to myself. I wanted to eat healthier. I wanted to feel better about the choices I’ve made. I wanted to know that twelve months later I could become a better person.

It doesn’t matter if any of my resolutions came to fruition (a couple may have); what matters is that I wanted to make them. I wanted to hold myself accountable, but in a way that wouldn’t be punished. It’s a hollow gesture, sure, but a promise without repercussions is the same thing. I opened myself up to the idea that there are parts of me that should be changed, so why shouldn’t I try to change them now? This type of failure just means I’m the same person; there’s no added physical or emotional damage. Undertaking a task that’s likely to fail has a ring of earnest stupidity, but it’s those moments that illuminate. They educate. They’re something to hold onto when it’s difficult to persevere. And when they succeed, they help ease the pain of all the other failures.

I wrote a list of more than twenty New Year’s resolutions for 2015, folded it up, and it will remain in a dresser drawer for the next twelve months. I haven’t necessarily committed them all to memory, but I know what I was feeling when I wrote them. I know what I wanted to accomplish, what I wanted to feel. Some will be extremely difficult — giving up one of my favorite foods, for example — and others may not truly be possible. Attempting to be happier seems like a valiant resolution, but how can it be proven? Can it be proven? I certainly don’t know the outcome now, but I’m excited to work toward the possibility of success. I can’t wait to try to prove myself right after so many failed resolutions that proved me wrong.

It’s the opportunity that I’m chasing, a chance to change (for the better, hopefully). My dreams are in the ether now; all I can do is try my hardest to achieve them.

Marcus Lattimore’s Dream Deferred

I had been on the fence about paying college athletes, but mostly for selfish reasons. I paid — with the help of my parents and currently life-defeating student loans — for my four-year education at a public university. Each monthly payment feels slightly worse than the last, a stained reminder of wasted potential and missed opportunities.

But I fully support paying college athletes now.

The one caveat is that no one truly knows the best model to use. Certain universities leverage the success of their football and/or basketball programs to help pay for other sports, sports that may be important to some in the student body but don’t bring in much revenue. Does a sliding scale work? The Olympic model? Regardless, at some point in the future college athletes will be paid. And it may be at least in part due to Marcus Lattimore.

Lattimore signed with his home-state South Carolina Gamecocks in 2010 after being named the ESPN National High School Junior of the Year in 2008 and an All-American the following year. He was a game- and school-changing recruit, a player who was supposed to fundamentally alter the trajectory of the Gamecocks for years to come. One of the many factors in a recruit’s decision to attend a university is that school’s recent success; winning begets winning. Lattimore’s decision to play for South Carolina would not only help head coach Steve Spurrier win in the immediate future, it would likely also help him recruit even better talent to sustain that success.

Lattimore made an immediate impact, helping the Gamecocks reach the SEC Championship Game in his freshman season. He earned NCAA Freshman of the Year honors, as well as second-team All-American and first-team All-SEC nods. South Carolina capped that season in the 2010 Chick-fil-A Bowl as the 19th-ranked team in the nation.

The Gamecocks were even better the following year, winning the 2012 Capital One Bowl over the Nebraska Cornhuskers and finishing as the ninth-ranked team in Division I. But they won without Lattimore, who had torn a knee ligament in October and missed the remainder of the season. Despite playing only half the year, Lattimore still earned second-team All-SEC honors, set a single-game rushing record and became the university’s all-time rushing touchdown leader. Unfortunately, the injury wouldn’t be his last.

Lattimore returned the following season to post impressive, but not otherworldly, numbers. However, he dislocated his knee in late October and would again miss the remainder of the season. In his three seasons with the Gamecocks, Lattimore rushed for 2,667 yards and 38 touchdowns, tied for 168th all-time with eleven other rushers. He joined South Carolina as a can’t-miss, program-invigorating recruit, but he left it as a mid-round draft prospect with injury concerns. Lattimore may have given his best years to South Carolina, with little to show for his hard work in return.

It’s difficult for most people to grasp the idea of someone reaching their peak potential so early in life. For office workers, the top of the bell curve lands approximately between 35 and 45 years old, but it’s quite different for most athletes. The average NFL career spans less than four years, the length of a rookie contract. The most recent CBA significantly reduced a new player’s salary, while an artificial age/college service time minimum means players must suffer through the rigors of unpaid labor — up to eight competitive years — before even getting a chance to earn a (league-minimum) salary.

There are some logical reasons for such limitations, but under the current economic environment, college players are wasting valuable earning potential while their coaches and athletic directors rake in the cash. After his freshman season, Marcus Lattimore likely would’ve been a first-round pick in the 2011 NFL Draft. Instead, he was a fourth-round pick in 2013. He signed a contract with the San Francisco 49ers that would pay him $2.5 million over four years. The South Carolina Gamecocks, meanwhile, likely netted more than $4.5 million from their most recent Capital One Bowl appearance.

Eventually college athletes will be paid. Between the various lawsuits in progress and those that will surely be filed, the NCAA’s “slavery” model will crumble. College football’s next iteration may look quite different or could be exactly the same; the talent base may consolidate further in the South, and the capital will go along with it. The NFL could lift its restrictions or change its rules, but that seems unlikely in the short term, or really even the long term; a partnership with the NCAA is beneficial for most of the parties involved, except the college athletes.

Education is used as a defense of the current model, but that defense has eroded over time. The latest scandal, involving past players at the University of North Carolina, exposed the harsh truth that a college degree serves as more of a barrier than an incentive. Since grades are the primary measure of academic success, students will do whatever it takes to reach that benchmark. For athletes, grades are an obstacle to overcome; sports are their most important measure of excellence.

If athletes like Lattimore are helping bring universities millions of dollars — between performance payouts (bowl games), television contracts and endowments — in addition to marketing those universities to future recruits, they deserve to see a portion of that revenue. Lattimore sacrificed millions — if not tens of millions — of dollars because of NFL bylaws and the NCAA’s greed. Future players shouldn’t have to worry about having enough money to eat, especially when the commissioner of the SEC, Mike Slive, pocketed more than $1 million for his work in 2012.

It’s a long way from fair pay for hard labor, but eventually players like Marcus Lattimore won’t have to suffer for free.