Win95 machines with 4 MB RAM were exceptions, not the rule. It was very painful to use such a machine, as it spent all its time swapping and otherwise doing nothing.
Any realistic setup had 8 MB RAM or more.
Applications at that time also didn't support i18n, didn't anti-alias fonts, and had low-res, low-color assets that were good enough at 320x200(240)/640x480 resolutions.
Windows 95 was painful to use even with 8 MB. 12 MB was the minimum that didn't cause it to swap all the time when you were actually doing something. 16 MB was nice.
A Pentium 133 with 8 MB and 800x600 (32-bit color) ran fine with Win95a and several open applications. Try that with Android, even with 2 GB RAM and a quad-core CPU - the Java-based system on top of Linux is quite resource hungry. Flagship Android 5 phones have at least twice the hardware spec of the iPhone 6 (twice as much RAM and CPU) and are comparable in performance and user experience (latency), not faster. That's the difference between Objective-C and Java. And old applications like Microsoft Office are all coded in C/C++, with parts of older versions in assembler.
One icon in Win95 took 512 bytes (32x32, 16 colors, 4 bitplanes). One icon in Android takes 256 kB (256x256, truecolor). The 800x600, 16-bit hicolor framebuffer (that's what I used at the time) was a bit under 940 kB. A 1920x1200 truecolor framebuffer is 8.8 MB, not counting the texture backing stores used by modern display servers.
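The arithmetic above is easy to check yourself (a quick sketch; `bitmap_bytes` is my own helper, and these are raw uncompressed pixel sizes, ignoring file headers and palettes):

```python
def bitmap_bytes(width, height, bits_per_pixel):
    """Raw pixel data size in bytes for an uncompressed bitmap."""
    return width * height * bits_per_pixel // 8

win95_icon   = bitmap_bytes(32, 32, 4)        # 16 colors -> 4 bitplanes
android_icon = bitmap_bytes(256, 256, 32)     # truecolor with alpha
fb_hicolor   = bitmap_bytes(800, 600, 16)     # 16-bit hicolor framebuffer
fb_truecolor = bitmap_bytes(1920, 1200, 32)   # truecolor framebuffer

print(win95_icon)                 # 512 bytes
print(android_icon // 1024)       # 256 kB
print(fb_hicolor / 1024)          # 937.5 kB, "a bit under 940"
print(round(fb_truecolor / 2**20, 1))  # 8.8 MB
```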
The amount of RAM needed has to do with the assets used by the code, not the code itself. The code itself is minuscule.
And no, Android phones do not have 4 GB RAM. The low end has 512 MB, many phones are in the 1-1.5 GB range, and the 2015 flagships have 3 GB. (The Nexus 5 and 7 have 2 GB; the Nexus 6 has 3 GB.) All that without swap (where would you like to swap? To flash?). While most modern 32-bit ARM CPUs do come with LPAE, Android does not support it, so going above 4 GB will have to wait for ARMv8.
Android doesn't support LPAE? That's pretty surprising - do you have any sources for that? LPAE doesn't need any usermode support to function. What specifically does Android do to prevent the underlying Linux kernel from using LPAE?
Well, one thing is what the Linux kernel supports by itself; another is what the board support package for your chipset enables. So maybe there is an LPAE Android device somewhere whose SoC provider did bother, but in general, nobody does.
Your phone is also Intel based, not ARM. That opens another question - would Intel be able to make phone SoCs if the Android SDK compiled to native ARM code, as some advocates prefer?
That shows how little you understand about the Android build tools. Dalvik and ART are not compilers.
Prior to this year, javac compiled the Java code to .class files, and then dx translated the Java bytecode in the .class files into Dalvik bytecode in a .dex file, with some simple dedupe optimizations.
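That classic pipeline looked roughly like this (illustrative commands only - `dx` shipped with the Android SDK build-tools, and the exact paths and flags varied by SDK version):

```shell
# Compile Java source to JVM .class files
# (in practice, with -bootclasspath pointed at android.jar)
javac -d obj src/com/example/MyActivity.java

# Translate JVM bytecode to Dalvik bytecode;
# dx accepts .class directories and .jar files alike
dx --dex --output=classes.dex obj/
```

The resulting classes.dex is what gets packaged into the APK and executed by Dalvik or ART.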
Only this year did the Android build system switch to Google's own compiler.
Go take a degree in computer science, learn about intermediate code representation, compiler frontend, compiler backend, CPU instructions, JIT compiler, AOT compiler, register selection.
Then make little drawings about which piece of Android is converting intermediate code representation into native CPU instructions.
For brownie points, compare the quality of the generated assembly code between Hotspot, Dalvik and ART for the same unmodified jar file.
Already done, and I wrote a non-optimizing Lisp compiler and an optimizing toy compiler with common subexpression elimination and fancy register allocation.
I gather from your response that you've realized you were wrong about Android not using javac but were too proud to admit it. Don't worry, we can fix your pride problem with these tasks below:
1. Dalvik and ART don't take jar files as input, so it is impossible to get your brownie points. Learn why.
2. Oracle's Hotspot targets x86 and x86-64, and Dalvik and ART are mostly focused on ARM. Learn the difference between ISAs.
3. Hotspot and Dalvik make different tradeoffs between CPU and memory both in their choices of garbage collectors and in their JIT strategies. Think about why that would be.
4. The word "compiler" by itself refers to a program that translates source code into object code. Notably, an assembler is not usually considered to be a compiler, and JIT "compilers" were originally called dynamic translators for three decades, with JIT compiler only appearing in the 90s. Given that terminology background, figure out why most people would call javac a compiler but not Hotspot or Apple's Rosetta.
> Already done, and I wrote a non-optimizing Lisp compiler and an optimizing toy compiler with common subexpression elimination and fancy register allocation.
And yet you failed to grasp the difference between a frontend, a backend, and an intermediate execution format.
> I gather from your response that you've realized you were wrong about Android not using javac but were too proud to admit it. Don't worry, we can fix your pride problem with these tasks below:
I don't have to acknowledge anything. Everyone knows that javac does not execute code on the Android platform. As such, talking about whatever influence it might have on runtime performance - beyond peephole optimizations, constant folding and similar AOT optimizations - only reveals ignorance of the Android stack.
> 1. Dalvik and ART don't take jar files as input, so it is impossible to get your brownie points. Learn why.
Yes, they do. Jar files get converted into dex files, which means the same file can be used as canonical input for both platforms.
Then again we are learning about Android aren't we?
> 2. Oracle's Hotspot targets x86 and x86-64, and Dalvik and ART are mostly focused on ARM. Learn the difference between ISAs.
Maybe you are the one who should inform yourself about Oracle's and its certified partners' Java JIT and AOT compilers for ARM platforms.
Learn about the Java eco-system.
> 3. Hotspot and Dalvik make different tradeoffs between CPU and memory both in their choices of garbage collectors and in their JIT strategies. Think about why that would be.
Of course they make different tradeoffs. The ones made by Dalvik and ART are worse than the approaches taken by other Java vendors, which is why they generate worse code, which leads to bad performance.
Learn about commercial embedded JVMs.
> 4. The word "compiler" by itself refers to a program that translates source code into object code. Notably, an assembler is not usually considered to be a compiler, and JIT "compilers" were originally called dynamic translators for three decades, with JIT compiler only appearing in the 90s. Given that terminology background, figure out why most people would call javac a compiler but not Hotspot or Apple's Rosetta.
Learn about the Xerox PARC documentation and its references to JIT compilers.
Or better yet feel free to dive into OS/400 documentation about its kernel level JIT compiler.
All of which go back a little earlier than the '90s.
I remember a 386DX (40 MHz) with 4 MB being unusable for anything (yes, it was possible to install Win95, but that's all) and a Pentium 120 with 16 MB and an S3 card running 800x600 hicolor being great. In 1996.