Hacker News

This was a much better article than I was anticipating. (A lot of Apple criticism is either shallow or vitriolic, I guess a side effect of inspiring so much fervor on either end of the spectrum.)

My new favorite anecdote of Apple's deficiencies on the services side is how the App Store and iTunes Connect were down for at least half a day a couple of weeks back, with no visibility beyond a press release:

http://thenextweb.com/apple/2015/03/11/the-apple-app-store-i...

Or back in late January, when iTunes Connect had a bug that randomly logged you in as a different user:

http://9to5mac.com/2015/01/29/widespread-itunes-connect-issu...

It's hard for me to imagine another company for which either of these incidents wouldn't inspire a round of warranted criticism -- especially given Apple's poor track record with services.

Still, it makes sense that Apple would continue to deprioritize developer relations (either implicitly or explicitly). Put frankly, we're a tiny sliver of their customer base: while the App Store might be raking in billions from that 30% cut, it's still more valuable to the company as a way of selling devices than as a revenue stream in its own right.



> Still, it makes sense that Apple would continue to deprioritize developer relations (either implicitly or explicitly).

Yes. I think it also makes sense to remind ourselves of how Apple and Microsoft started in the 1970s. They have very different origin stories.

When Bill Gates and Paul Allen were burning the midnight oil on their first successful Microsoft product, it was a BASIC interpreter for the Altair computer -- a product for programmers. Yes, Microsoft went on to dominate with operating systems (first DOS, then Windows), but they have "programmers" in their DNA. The CEO of the company (Bill Gates) was an ex-programmer, and that perspective cascaded all the way down as direct and indirect decisions in what they did. People mocked Steve Ballmer's stage antics about "developers developers developers," but Microsoft really did pay attention to programmers like no other major corporation. The high praise for Visual Studio (in comparison to other IDEs such as Xcode, Eclipse, etc.) is one testament to this. So are dev-friendly connections with their internal programmers, such as active blog posts and channel9.msdn.com featuring informal whiteboard coding sessions.

Contrast with Apple. Steve Wozniak and Steve Jobs were burning the midnight oil assembling the Apple I to sell to consumers. Wozniak was a programmer, but the coding was incidental to making the Apple computer work (e.g., his triumphant story of programming the floppy disk controller to work correctly). They were selling end-user hardware, not a programmer's product like Microsoft. In 2011, none of the top 3 guys (Steve Jobs, Tim Cook, Jony Ive) were programmers. Craig Federighi is an ex-programmer, but Apple doesn't seem like the kind of company that would hand the reins over to him if Tim Cook stepped down. Craig is a powerful figure at Apple, but obviously he doesn't have the clout that Bill Gates did at Microsoft. The way Apple originated in 1976 has had a ripple effect all the way to the present day in 2015. It guided their org structure and their attitudes towards 3rd-party developers. As a consequence, we should expect blogs in 2015 complaining about their "hostility" towards developers. It's just the way Apple has always been.

As for the issue of too few developers making money in the App Store, I have an opinion that programmers are not going to want to hear: let go of the idea that you'll get significant income from the App Store. Consider the few megahits that happened (beer-drinking app, fart app, Flappy Bird, etc.) as outliers and lottery-ticket winnings. Sort of like the guy who sold pixels at a dollar apiece on a one-million-pixel billboard page when the web first appeared.[1]

I believe the only realistic way to view the App Store is to think of it as an adjunct to something else that makes you money. In other words, you can't just write a college-study iOS app, charge $1.99, and expect income; instead, you build a whole college-assistance website, and the iPhone app is just one gateway to that functionality. It's the website that has subscribers and generates the major revenue. It's sort of like the "enterprise" model: American Airlines doesn't "sell" its iOS app; it sells plane tickets, and the $0.00 app is just a convenient way for customers to generate boarding passes, check gate times, etc.

That's the financial picture I've come to accept from the App Store, and I recommend programmers consider it to avoid heartache. There's no need to complain about getting a "bigger piece of the pie" from the App Store, because there is no (predictable) pie of any significance to fight over. I add "predictable" as a qualifier because I'm not dismissing that the megahits can make a lot of money. You just can't depend on it.

[1]http://www.milliondollarhomepage.com/

EDITED to fix spelling errors.


"Contrast with Apple. Steve Woz and Steve Jobs were burning the midnight oil assembling the Apple I to sell to consumers. Steve Woz was a programmer but the coding was incidental to making the Apple computer work. (E.g. his triumphant story of programming the floppy disk controller to work correctly.) They were selling end-user hardware and not a programmer's product like Microsoft. In 2011, none the top 3 guys (Steve Jobs, Tim Cook, Jony Ive) were programmers. Craig Federighi is an ex-programmer but Apple doesn't seem like the company that would hand the reigns over to him if Tim Cook stepped down. Craig is a powerful figure at Apple but obviously, he doesn't have the clout that Bill Gates did at Microsoft. The way Apple originated in 1976 had a ripple effect all the way to the present day in 2015. It guided their org structure and attitudes towards 3rd-party developers. As a consequence, we should expect blogs in 2015 complaining about their "hostility" towards developers. It's just the way Apple has always been."

Right -- realistically, if Cook were to step down tomorrow, you'd have to assume that Ive would get the call-up. I agree with your comments on Federighi. Eddy Cue doesn't seem to give off the right vibe. Phil Schiller doesn't really work -- he's Marketing. Maybe the only other candidate would be Dan Riccio, VP of Hardware Engineering, who I'm surprised we don't see more of; he has largely shunned the limelight (or not been offered it).

But very much so: while Visual Studio isn't without flaws, I think few could realistically claim that it hasn't been "the" IDE of note.


> You would have to assume that Ive would get the call-up

I'd sooner believe they would bring in a new CEO from outside before they'd put Ive in the role -- or before Ive would accept it.


I developed on both the Apple II and MS-DOS. The Apple II was OK for developers, by the standards of the day: it had BASIC and an open hardware interface. In my view, the hostility towards developers started with the Mac, and may have had its roots in good intentions on Apple's part.

Apple wanted users to have a uniform, high-quality UI across all apps. That required some policing and a close relationship with a small number of favored developers. Their API was probably too complex and brittle for widespread unfettered software development, and they probably didn't want their platform to be defined by a proliferation of "bad" software from developers like me who didn't care about their UI standards. Many users would accept "bad" software that got the job done: small apps for automating business processes, engineering tools (cross-assemblers for microcontrollers), games, etc. Developers turned to MS-DOS, which had a vastly simpler API. I wrote "bad" software, i.e., software that respected no centralized UI standard.

I'm not sure Windows was much better before Visual Basic came along, but long into the Windows 3.1 era, programmers still used MS-DOS for simple things.

Today, I think the situation is much different. You can program an Apple because you can program any computer. Most developers have at least a dim grasp of GUI concepts, and the APIs (including their documentation and development tools) have improved to the point where it's easier to use the built-in UI features of a platform than to reinvent the wheel. This doesn't necessarily produce great software, but it allows platforms to evolve without breaking existing apps. And you can write "bad" software that is utterly platform independent, thanks to JavaScript running in the browser. Without the "passive curation" of an opaque API, Apple has chosen to actively curate apps.


Are you sure of your history? The Apple I was just an assembled logic board; you had to provide the power supply, monitor, keyboard, case, etc. Hardly a consumer-aimed product -- it was mainly meant for the computer hobbyist. Steve Wozniak wrote his own version of BASIC, and it was fairly innovative for the time, as I remember. Apple may have gone in the direction you mention, but it did not start off that way. Your argument might work for the original Mac, though there was still quite a bit of good third-party software for it; I'm not sure about the quality, or even the existence, of its developer tools. I never went beyond HyperCard in those days...


I have never heard him referred to as "Steve Woz". Steve Wozniak, sure. Woz, yes. Or even "the Woz".


But if they don't take good care of that "paltry" revenue stream, there will be a greater cost in lost device sales.



