One X vs. One S. Performance and dev - HTC One X

Getting a new phone, as I ran over my Razr with my Land Cruiser 40...
I live in Norway, so I would be getting the EU version of the X with the Tegra 3.
But looking at the benchmarks of the US version of the X (dual-core), it is clearly very fast. I'm wondering if we would get similar performance out of the S, and would it be as "xda friendly" as I suspect the X will become?
Money is not the issue; I'm just not sure if I would be comfortable with such a large phone (well, the Razr had bezels from hell, so it was very wide).

buljo said:
Getting a new phone as I ran over my Razr with my landcruiser 40..
Live in Norway so I would be getting the EU version of the X with tegra 3.
But looking at the benchmarks of the us version (dual core) of the X, it is clearly very fast. Wondering if we would get similar performance out of the S? And would it be as "xda friendly" as I suspect the X will become?
Money is not the issue, just not sure if I would be comfortable with such a large phone.. (well, the Razr had bezels from hell, so it was very wide)
Click to expand...
Click to collapse
The One S's Krait is fast for a single app, and even for two, but Tegra 3 clearly outperforms Krait in multi-app and gaming performance. So I would say Tegra 3 is more future-proof than the dual-core Krait.
Sent from my GT-I9000 using xda premium

HTC One S
I'm a lucky guy who already has an HTC One S. I ran the same benchmarks as the ones published for the HTC One XL, and I got the same results.

Rastasia said:
i m a lucky guy with already an HTC One S. I did the same benchmark than the ones published for the HTC One XL and I ve got the same results.
Click to expand...
Click to collapse
Proof please of phone ownership.
It doesn't make any sense for HTC to release their flagship with a less powerful processor than their "mid-range" phone.
It wouldn't be the first time a mobile phone company forgot about what makes sense, but it's not like the HTC one X is going to be underpowered, regardless.
I really don't like the look of PenTile screens, which was the main deciding factor in going for the One X for me.
From the comments on that benchmark blog post, it seems the tests are unrealistic; the scores for the alternatives are artificially low (IIRC).

At its native qHD res, the One S vs. the One X at its native 720p res: they trade blows and are almost equal; the One X will only show its muscles in quad-optimised apps.
As for One X vs. One XL: the One X is better, since Tegra 3 beats the dual-core S4 at 720p.
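For context on that resolution gap, here's a quick back-of-the-envelope pixel count (assuming the commonly quoted panels: qHD 960x540 for the One S, 720p 1280x720 for the One X/XL):

```python
# Pixels each GPU has to push at native resolution (assumed panel specs)
one_s_pixels = 960 * 540    # qHD
one_x_pixels = 1280 * 720   # 720p
print(one_s_pixels, one_x_pixels)             # 518400 921600
print(round(one_x_pixels / one_s_pixels, 2))  # 1.78
```

So the One X/XL pushes roughly 78% more pixels per frame, which is why per-pixel GPU comparisons at each phone's native res come out close.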

qpop said:
Proof please of phone ownership.
It doesn't make any sense for HTC to release their flagship with a less powerful processor than their "mid-range" phone.
It wouldn't be the first time a mobile phone company forgot about what makes sense, but it's not like the HTC one X is going to be underpowered, regardless.
I really don't like the look of the pentile screens, which was the main deciding factor for the One X for me.
From the comments on that benchmark blog post it seems the tests are unrealistic; the scores for the alternatives are artificially low (iirc)
Click to expand...
Click to collapse
It doesn't make sense, but they have crippled the One S with the PenTile screen and low storage.
The S4 chip in the One S is a generation ahead of anything else until the ARM A15 chips arrive; Qualcomm's Krait is supposed to be much closer to A15 spec than the A9 cores in Tegra 3.

Proof
http://www.flickr.com/photos/[email protected]/7026471353/
http://www.flickr.com/photos/[email protected]/6880371452/
http://www.flickr.com/photos/[email protected]/7026471425/

Rastasia said:
http://www.flickr.com/photos/[email protected]/7026471353/
http://www.flickr.com/photos/[email protected]/6880371452/
http://www.flickr.com/photos/[email protected]/7026471425/
Click to expand...
Click to collapse
Congrats on your sexy beast, mate; I love the One S.
But these benchmarks are nothing new; Vellamo is a Qualcomm test that isn't heavily multi-threaded.
Excellent device, what color did you get?

The blue/grey one. Do you want me to test another benchmark?
Sent from my HTC One S using XDA

Rastasia said:
Thé blue/grey one. Do u want me to test on an other benchmark?
Sent from my HTC One S using XDA
Click to expand...
Click to collapse
Best color, mate! I bet it's gorgeous.
Yes, try AnTuTu https://play.google.com/store/apps/details?id=com.antutu.ABenchMark&feature=search_result
and the GLBenchmark offscreen 720p tests https://play.google.com/store/apps/details?id=com.glbenchmark.glbenchmark21&feature=search_result#?t=W251bGwsMSwxLDEsImNvbS5nbGJlbmNobWFyay5nbGJlbmNobWFyazIxIl0.

Congrats on having the new device.
May I ask how you got it so fast?

fi3ry_icy said:
congrats on having the new device..
may i ask why did u got it so fast... ?!?!
Click to expand...
Click to collapse
It's a test device from a provider.
---------- Post added at 02:47 PM ---------- Previous post was at 02:37 PM ----------
hamdir said:
best color mate! i bet its gorgeous
yes try Antutu https://play.google.com/store/apps/details?id=com.antutu.ABenchMark&feature=search_result
and GL benchmark offscreen 720p tests https://play.google.com/store/apps/details?id=com.glbenchmark.glbenchmark21&feature=search_result#?t=W251bGwsMSwxLDEsImNvbS5nbGJlbmNobWFyay5nbGJlbmNobWFyazIxIl0.
Click to expand...
Click to collapse
Can't get your second benchmark, but here's your first request:
http://www.flickr.com/photos/[email protected]/7026577361/

Rastasia said:
it s a test device from a provider
---------- Post added at 02:47 PM ---------- Previous post was at 02:37 PM ----------
can t get your second benchmark but here s your first request
http://www.flickr.com/photos/[email protected]/7026577361/
Click to expand...
Click to collapse
Thanks, you just confirmed my point of view: AnTuTu shows off the quads a lot better than a Qualcomm test does.

The HTC One X is way better with Tegra
hamdir said:
thanks you just confirmed my point of view, Antutu shows off the quads a lot better than a qualcomm test
Click to expand...
Click to collapse
This is proof. Source:
http://www.youtube.com/watch?v=TmWRaaAteZg

Those who haven't checked AnandTech's review of the iPad 3, I suggest you do; it's full of juicy information.
Having settled all this info in my mind, it's quite easy to draw a clear picture.
Tegra 3 is a chipset that jumped ahead of most competitors with such an early entry into quad-core mobile CPUs; the only other quad on the market is the PS Vita's SoC (Sony CXD5315, built by Toshiba), both quad A9. Early worries about memory bandwidth/L2 cache are unfounded, simply because the ARM A9's memory controller can't keep up with more anyway.
Snapdragon S4 introduced a dual memory channel and major optimization and per-core performance; however, its advantage is offset by the lack of cores. It's a good design for a quad, but right now its excellent cores will still be stalled by multi-tasking, and its amazing memory bandwidth will go to waste.
Right now Tegra 3 is still the best mobile CPU you can have, better than the A5X and the dual S4; however, the major letdown of the Tegra 3 is its GPU.
Nvidia claimed a 12-core GPU on the T3, but that's simply the number of SIMDs, not physical cores. ARM also calls its SIMDs "cores", but this only confuses customers; Nvidia will use this naming scheme to counter Apple's claims, as Asus has already done on Twitter.
The SGX543MP series (PowerVR, not ARM) has physical core scaling, but it's only 8 SIMDs per core, vs. 12 on the Tegra 3, 8 on the Adreno 225 and 5 on the Mali-400; the A5 has 2 cores and hence 16 SIMDs, while the A5X has 4 cores and hence 32 SIMDs.
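To make the core-counting game concrete, here's a small tally of total SIMDs as cores x SIMDs-per-core. The per-core counts are the figures quoted in this post, not official vendor specs:

```python
# "Marketed cores" vs. SIMD lanes: total SIMDs = physical cores x SIMDs per core
# (counts as quoted in this thread, not vendor datasheet values)
gpus = {
    "Tegra 3 (GeForce ULP)": (1, 12),
    "Adreno 225":            (1, 8),
    "Mali-400":              (1, 5),
    "Apple A5 (SGX543MP2)":  (2, 8),
    "Apple A5X (SGX543MP4)": (4, 8),
}

for name, (cores, simds_per_core) in gpus.items():
    print(f"{name}: {cores * simds_per_core} SIMDs total")
```

Counted this way, the A5X's 32 SIMDs dwarf the T3's 12, which is why the "12-core" marketing label is misleading.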
In reality the Tegra 3 GPU falls a little short of the iPad 2's GPU, while it beats the Mali-400 and Adreno 225 in most situations, though not all. Nvidia have extracted all they can from this GPU through aggressive driver optimizations and hacks; this is how they achieved their 3x-Tegra-2 claim. In other words, it's already optimized, so don't expect much headroom here.
Nvidia's GPU is really disappointing but not a disaster; it just doesn't leave a lot of overhead. Right now it's still the fastest GPU for Android, and it has the quad to back it up once an app is T3-optimized. The quad cores can add console-quality gameplay extras like ragdoll, physics and particles, but might not improve FPS (that would require an engine written from the ground up for multi-core, and I doubt devs will be inclined).
The iPad 3's GPU is massively powerful, a testament to the PS Vita's GPU. However, unlike on the Vita, its power is wasted on all those pixels, so games will benefit from it but won't see the 2x jump over current iPad 2/iPhone 4S games. As Infinity Blade 2 shows, it only managed a 1.4x resolution increase without losing frame rate, so most 3D games on the iPad 3 will not be Retina-boosted. Why do I keep bringing up the iPad? Because iOS is still the leader in mobile gaming and most games we get on Android are ports; the future of iOS games will shape the future of Android games.
All this makes me conclude the following:
Android's main appeal is still the OS, what you can do with it, and multi-tasking, which translates into the main appeal of Tegra 3; it's ridiculous to even think quad cores cannot benefit such a heavily multi-threaded OS.
Android is still not the best platform for gaming, but whether we like it or not, its best grounds for gaming is TegraZone, simply because we have Nvidia pushing/bribing developers in this direction.
If you are buying an Android device right now, the best you can get is Nvidia's Tegra 3, but damn you, Nvidia, for not being more generous.
It's been the case for ages: asymmetry between CPU and GPU power. The Xbox 360 had the more powerful GPU relative to its CPU, the PS3 had the CPU against its GPU, and the Apple A5X has its GPU against last year's CPU. The only SoC that satisfies both sides is the PS Vita's, with its quad CPU and quad GPU, but that's because Sony has to worry about a product life cycle of over 4 years.
So you can see Tegra 3 has the CPU against its GPU; it's not really breaking the norms here, it's business as usual.
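The multi-tasking point can be illustrated with a toy example in generic Python (not Android code): four independent tasks spread across parallel workers finish in roughly the time of one, which is the kind of benefit a heavily multi-threaded OS gets from extra cores.

```python
import time
from concurrent.futures import ThreadPoolExecutor

def app_task(name):
    time.sleep(0.2)   # stand-in for an independent app's workload
    return name

start = time.perf_counter()
with ThreadPoolExecutor(max_workers=4) as pool:  # a "quad core"
    results = list(pool.map(app_task, ["mail", "browser", "music", "sync"]))
elapsed = time.perf_counter() - start
print(results)
print(f"{elapsed:.2f}s for 4 tasks")  # ~0.2s in parallel, vs ~0.8s one after another
```

With one worker the same four tasks would take four times as long; that is the multi-tasking win, independent of any single app being quad-optimized.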

Went for the S.
Well, I just posted in the S forums:
http://forum.xda-developers.com/showthread.php?p=24454178#post24454178
I ended up with the One S instead of the X; the feel of the phones did it for me.

Related

[SGH-T989] Qualcomm based SGS2 "Celox" aka "Hercules"

So this is interesting. There was a lot of confusion about T-Mobile US's new phone, the Hercules, and whether or not it was an SGS2 variant. Well, it is and it isn't. This link talks about an SGS2 version launching in Korea and Germany that uses the same Qualcomm SoC as the Sensation. That's an interesting choice, because the Sensation does poorly on benchmarks. Other than being LTE-equipped, it has the same specs and looks the same as the T-Mobile US Hercules. So apparently Samsung is being pretty liberal with what they define as an SGS2.
http://sammyhub.com/2011/08/09/is-this-samsung-galaxy-s-ii-lte-phone-codenamed-celox/
I'm not sure I would be interested in this. It's a larger screen (4.5" is too big) and it will probably get worse battery life with LTE. I think I'll wait for the Galaxy S III.
smartbot said:
I'm not sure I would be interested in this. It's a larger screen (4.5" is too big) and it will probably get worse battery life with LTE. I think I'll wait for the Galaxy S III.
Click to expand...
Click to collapse
The 8060 SoC sort of sucks in the Sensation. That would kill it for me before the 4.5" screen. It's actually the same chip that's in the HP TouchPad. Interesting choice on Samsung's part. The radio is market-specific, so LTE won't be everywhere.
It looks like Samsung is trying to cash in on the Galaxy S name and push as many phones as they can.
Probably a short-term business decision, regardless of the consequences to the name.
BarryH_GEG said:
So this is interesting. There was a lot of confusion about T-Mobile U.S.' new phone the Hercules and whether or not it was an SGS2 variant. Well it is and it isn't. This link talks about a SGS2 version launching in Korea and Germany that uses the same Qualcomm SoC as the Sensation. That's an interesting choice because the Sensation does poorly on benchmarks. Other than being LTE equipped it's the same specs and looks the same at the T-Mobile U.S. Hercules. So apparently Samsung's being pretty liberal with what they define as a SGS2.
http://sammyhub.com/2011/08/09/is-this-samsung-galaxy-s-ii-lte-phone-codenamed-celox/
Click to expand...
Click to collapse
Actually, the Sensation's chip isn't nearly as crappy as it was prior to getting S-OFF. Once developers were able to make kernel mods and other tweaks, the chip performed much better than it did out of the box. I think some of the poor benchmark scores can be attributed to the qHD screen of the Sensation. However, I ran CF-Bench last night with both my SGS2 and Sensation clocked at 1.5 GHz, and the Sensation beat it each time. The GPU, the Adreno 220, is surprisingly good. I would be interested to see the Qualcomm chip properly implemented, with the hardware and software coded in sync.
jlevy73 said:
Actually the Sensation chip sin't nearly as crappy as it was prior to getting s-off. Once developers were able to make kernel mods and other tweaks the chip performs much better than it did out of the box. I think some of the poor benchmark scores can be attributed to the qHD screen of the Sensation. However, I ran cf-bench last night with both my sgs2 and sensation clocked at 1.5ghz and the Sensation beat it each time. The gpu of the adreno 220 is surprisingly good. I would be interested to see the qualcomm chip properly implemented such that the hardware and software were coded in sync
Click to expand...
Click to collapse
If the same chip performs significantly faster in Samsung's implementation the Sensation folks are going to be pissed. I'd also imagine Samsung will do a much better job with video drivers so it'll support tons more formats than the Sensation. With all the rumors about Tegra being the alternate due to Exynos shortages it's interesting they went with Qualcomm.
I'd like my SII to have a 4.5" screen and the back cover of this phone. That's all.
I'm not real smart on this aspect of the technology (chips and performance), but I really doubt that the benchmarks accurately reflect real-world performance.
I have an SGS2. Say I take my Evo 3D, turn on hotspot, run my SGS2 off the Evo, and do a Speedtest app test on each: the Evo measures 7-9 Mbps, the SGS2 around 3.
I then immediately load a series of graphics-heavy sites simultaneously, and the SGS2 will finish quicker every time.
I'll run a comparison against a Thunderbolt on the almighty Verizon LTE.
The Speedtests are Thunderbolt 19-21 (lol), SGS2 (hotspotting to WiMAX) measuring around 3, and again, when it comes to site downloads the SGS2 is just faster (though more marginally).
That said, the Sensation was a disappointment. Makes sense to me that it wasn't all the chip's fault.
But all that said, my gut tells me those enjoying the SGS2 like I have are gonna be in for a letdown in performance with the Herc.
Hope I'm wrong. Been anticipating it myself.
rockky said:
I'm not real smart on this aspect of this technology (chips and perfrormance), but I really have doubt that the benchmarks accurately reflect real world performance.
I have an SGS2 and say take my Evo 3D, turn on hotspot, run my SGS2 off the Evo, and do a Speedtest app test on each; the Evo measures 7-9M's, the SGS2 runs 3ish.
I immediately run a series of graphic heavy site's simultaneously and the SG will finish quicker every time.
I'll run a comparison of Thunderbird on the almighty Verizon LTE.
The Speedtests are Th:19-21 lol, SGS (Hotspot... ting to Wimax) measuring 3ish, and and AGAIN,when it comes to site downloads SGS2 is just faster (more marginally).
That said, the Sensation was a disappointment. Makes sense to me that it wasn't all the chips fault.
But...all that said,gut tells me, those enjoying the SGS2 like I have are gonna be up for a let down in performance with Herc.
Hope I'm wrong. Been anticipating it myself.
Click to expand...
Click to collapse
You're basically comparing LTE (VZW), WiMAX (Sprint), and HSPA+ (AT&T), which has nothing to do with the phone's processor. Play HD videos on the Sensation/E3D (Qualcomm) and SGS2 (Exynos) and you'll be quite surprised at the difference in real-world performance. jlevy73 brings up an interesting point in that devs seem to be getting better performance out of the Sensation now that it's unlocked than HTC was able to. But devs are still dependent on the drivers provided by the OEMs, so the Qualcomm chip in HTC phones might still end up with a real-world performance deficit no matter how much dev support it gets.
BarryH_GEG said:
You're basically comparing LTE (VZW), Wi-Max (Sprint), and HSPA+ (AT&T) which has nothing to do with the phone's processor. Play HD videos on the Sensation/E3D (Qualcomm) and SGS2 (Exynos) and you'll be quite surprised at the difference in real world performance. jlevy73 brings up an interesting point in that devs seem to be getting better perfromance out of the Sensation now that it's unlocked than HTC was able to. But devs are still dependent on the drivers provided by the OEMs so the Qualcomm chip on HTC phones might still end up having a real world performance deficit no matter how much dev support it gets.
Click to expand...
Click to collapse
OK... but I still don't totally understand. Aren't processors a factor in how fast data is transmitted?
rockky said:
OK...but still don't totally understand. Processors aren't a factor in how fast data is transmitted?
Click to expand...
Click to collapse
The incoming data isn't coming in fast enough to tax the processor. Testing something locally on the phone, like video, Flash-based web pages, or running multiple apps, is a better test of a processor's performance. Software and drivers make a big difference too. The browser on the SGS2 is hardware-optimized where the Sensation/E3D's is not, and it shows in everyday use.
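The point about local workloads can be sketched with a trivial CPU-bound micro-test (a generic Python illustration, not a real phone benchmark): pure computation stresses the processor itself, whereas a network speed test mostly measures the radio and the carrier.

```python
import time

def cpu_bound(n=500_000):
    # Pure computation: this taxes the CPU, unlike a speed test,
    # where the CPU mostly sits waiting on incoming packets.
    t0 = time.perf_counter()
    total = 0
    for i in range(n):
        total += i * i
    return total, time.perf_counter() - t0

total, secs = cpu_bound()
print(f"summed {total} in {secs:.3f}s")
```

A faster chip finishes this kind of loop sooner; it makes no difference to how many Mbps a Speedtest run reports.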
BarryH_GEG said:
The incoming data isn't coming in fast enough to tax the processor. Testing something locally on the phone like video, flash-based web pages, and running multiple apps are a better test of a processors performance. Software and drivers make a big difference too. The browser on the SGS2 is hardware optimized where the Sensation/E3D's are not and it shows in everyday use.
Click to expand...
Click to collapse
Thanks. Good to know that.
BarryH_GEG said:
The incoming data isn't coming in fast enough to tax the processor. Testing something locally on the phone like video, flash-based web pages, and running multiple apps are a better test of a processors performance. Software and drivers make a big difference too. The browser on the SGS2 is hardware optimized where the Sensation/E3D's are not and it shows in everyday use.
Click to expand...
Click to collapse
Suffice to say, then, that the US devices will suffer some performance deficit if the Qualcomm chips are employed vs. the Exynos?
rockky said:
Thanks. Good to know that.
Click to expand...
Click to collapse
lol, 3G/4G is like your internet connection on a PC: it has nothing to do with how powerful the CPU is.
nraudigy2 said:
lol 3G/4G is like your internet connection on PC has nothing to do with how powerful the CPU is.
Click to expand...
Click to collapse
That's true, but how fast a page renders, especially one heavy in JavaScript and Flash, does provide an insight into the CPU/GPU. I have the Crapbolt, I mean Thunderbolt, and LTE absolutely flies (i.e. 30 Mbps down). With my SGS2 on AT&T's network I get about 5 Mbps down. If I load up androidcentral.com (which is very heavy on graphics, Flash, etc.), the SGS2 renders the page 2-3x faster than my Thunderbolt. You can see the rat-in-a-cage processor of the Thunderbolt choking to render all those graphics.
jlevy73 said:
Actually the Sensation chip sin't nearly as crappy as it was prior to getting s-off. Once developers were able to make kernel mods and other tweaks the chip performs much better than it did out of the box. I think some of the poor benchmark scores can be attributed to the qHD screen of the Sensation. However, I ran cf-bench last night with both my sgs2 and sensation clocked at 1.5ghz and the Sensation beat it each time. The gpu of the adreno 220 is surprisingly good. I would be interested to see the qualcomm chip properly implemented such that the hardware and software were coded in sync
Click to expand...
Click to collapse
Unfortunately...
qHD doesn't have much to do with it.
The GPU on the Snapdragon gets overclocked as the CPU is overclocked.
The Exynos is a more robust processor, plus Mali is a more powerful GPU.
Maedhros said:
Unfortunately....
qhd dosent have much to do with it.
the gpu on the SD gets OC'd as the cpu is oc.
the Ex is a more robust processor + Mali is a more powerful GPU.
Click to expand...
Click to collapse
The Sensation's qHD screen has 35% more pixels than the Samsung S II's. It has a significant impact on processor workload and benchmarks. At 1.2 GHz the Exynos 4210 is much better than the Qualcomm 8060, but at 1.5 GHz the Qualcomm will outperform an Exynos at 1.2 GHz.
FishTaco said:
The sensations qhd screen has 35% more pixels than the samsung s ii. It has a significant impact on processor work load and benchmarks. At 1.2GHz the Exynos 4210 is much better than the Qualcomm 8060, but at 1.5Ghz the Qualcomm will outperform an Exynos at 1.2Ghz.
Click to expand...
Click to collapse
Sadly, it doesn't. Maybe with vcore tweaks it would, but I have both devices, and even clocked to 1.7 GHz the Sensation cannot match the SGS2 in any benchmark I tried except CF-Bench.
FishTaco said:
The sensations qhd screen has 35% more pixels than the samsung s ii. It has a significant impact on processor work load and benchmarks.
Click to expand...
Click to collapse
But if that is the case, then there isn't much excuse for the fact that almost every Tegra 2 device with a qHD display out there, at 1 GHz, has beaten the Qualcomm chip in the Sensation in every review I have watched. The Tegra devices use qHD screens and are clocked lower than the Sensation, yet outperform it significantly. Furthermore, about the pixels: the Sensation only renders much of Quadrant at 480x800 anyway, because for some reason quite a few applications aren't scaled properly, the Quadrant benchmark for example. Also, because most benchmarks count frames in 2D/3D graphics to help suss out speed, I often find my Galaxy S2 always hovers around 60 fps. That's because it has been limited to that by Samsung, so the true benchmark speed of the Galaxy S2 is higher than what is shown on stock firmware.
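The 60 fps cap point can be illustrated with a toy calculation (hypothetical numbers, just to show how a vsync limit hides headroom in frame-counting benchmarks):

```python
# A GPU that could render 90 fps reports only 60 once vsync-capped
uncapped_frame_time = 1 / 90      # what the hardware actually needs per frame
vsync_interval = 1 / 60           # display refresh limit
reported = 1 / max(uncapped_frame_time, vsync_interval)
print(round(reported))  # 60, even though the GPU had headroom for 90
```

So two devices pinned at 60 fps can look identical in a benchmark while having very different real headroom.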
danielsf said:
But if that is the case, then there isn't much excuse for the fact almost every Tegra 2 device with qhd displays out there, 1ghz, has beaten the Qualcomm chip in the sensation on every review I have watched. The Tegra devices use qhd's and yet are clocked lower the the sensations, yet out performs it significantly. Further more, about the pixels, the sensation only displays most quadrants on the 480*800 pixels anyway, because for some reason quite applications aren't scaled propyl for example in quadrant benchmark. Also, because most bechmarks count frames on 2d/3d graphics to help sus speed, I often find my galaxy s2 always hovers around 60fps. Thats because it has been limited to that by Samsung, so the true bechmark speed of that galaxy s2 is higher than what is show on stock firmware.
Click to expand...
Click to collapse
Yes, qHD makes a difference; look at PCs as an example.
Have you seen what an SGS2 does with Tegra 2? You'd be surprised.
The Sensation is not even Cortex-A9 based; it can't compete with the rest of the dual-cores.

Sensation XE or Galaxy Nexus ??

What do you guys think is the best smartphone to buy at the moment, the Sensation XE or the Galaxy Nexus?
I am a Desire user and was planning to get the Sensation XE, but now that the Galaxy Nexus is announced I am a bit confused. However, I really like the new Nexus, and now I am thinking of buying the Galaxy Nexus. The reason is that the UI is now getting very similar to HTC Sense, plus new hardware features such as the HD screen, Android Beam, etc. I will make my final decision after reading the Nexus reviews.
What do you guys say? Which one would you choose?
I'd rather wait a little longer (a few weeks or so) and see which other devices will get an ICS update.
That would make some other current devices much more appealing,
e.g. the Motorola ones.
HTC might not update their new devices with ICS, as it might slow down their phones with HTC Sense on top of it...
Personally I wouldn't want either; holding my Atrix firmly in one hand, I can just about reach the top left and bottom left corners of the touchscreen with my thumb. That would mean on a device like this I'd have to use it two handed... Why are high-end phones getting so giant these days?!
But given the choice between the two, I'd take the Sensation XE. I suspect the Galaxy Nexus will suffer rather badly in 3D performance as a result of using an SGX 540 (albeit one paired with dual-channel memory) for such a high-res screen. We know for sure it's going to get walked all over by the iPhone 4S with its 960x640 screen w/SGX543MP...
Depending on when current-gen Tegra 2/OMAP 4/Exynos/Snapdragon S3 devices get ICS updates, they may prove to be a better choice for that reason alone. This time next year, 1280x720 will probably be more viable. I suspect A5 is the only SoC with the grunt right now. Still, I could be proven wrong.
Edit: This is in the Desire forum? I'm confused...
alpha-dog said:
HTC might not update their new devices with ICS as it might slow down their phones with HTC Sense on top of it...
Click to expand...
Click to collapse
Sense with HW accel = WIN!
I'm just going by what they've said:
http://www.engadget.com/2011/10/19/htc-were-reviewing-ice-cream-sandwich-and-determining-our-plan/
Actually, I don't like Samsung phones... and both the Galaxy Nexus and the Sensation XE/XL are just TOO BIG >.>
I'd take an HTC Bliss or wait some more
Azurael said:
Personally I wouldn't want either; holding my Atrix firmly in one hand, I can just about reach the top left and bottom left corners of the touchscreen with my thumb. That would mean on a device like this I'd have to use it two handed... Why are high-end phones getting so giant these days?!
But given the choice between the two, I'd take the Sensation XE. I suspect the Galaxy Nexus will suffer rather badly in 3D performance as a result of using an SGX 540 (albeit one paired with dual-channel memory) for such a high-res screen. We know for sure it's going to get walked all over by the iPhone 4S with its 960x640 screen w/SGX543MP...
Depending on when current-gen Tegra 2/OMAP 4/Exynos/Snapdragon S3 devices get ICS updates, they may prove to be a better choice for that reason alone. This time next year, 1280x720 will probably be more viable. I suspect A5 is the only SoC with the grunt right now. Still, I could be proven wrong.
Edit: This is in the Desire forum? I'm confused...
Click to expand...
Click to collapse
Sorry for being a noob, but what makes the A5 better than the processor in the Galaxy Nexus? I thought the Nexus was a 1.5 GHz dual-core; surely that's faster?
I am in exactly the same position; I'm due for an upgrade. It took a long time to decide I wanted the Sensation XE, because it looks so good with the red highlights, and the Sense UI is a big favourite after having the Desire for 2 years now.
Now the Nexus has been announced, I don't know whether to wait and get that. The Nexus's vanilla ICS is really tempting because it looks so good and is such a simple OS, and I think once HTC Sense gets laid over the top it won't look anything like that. The thing that puts me off the Nexus is that I think it's too big; 4.65 inches is massive. The Sensation's 4.3 inches is bordering on too big.
It's a hard decision when you may own it on a contract for another 2 years; the Nexus is so future-proofed you could own it for 4 years and not break a sweat about being out of date.
Sorry if this is dumbed down, but I'm not exactly an expert on the specifics; any advice would be a bonus.
theboymini said:
The Nexus vanilla ICS is really tempting cause it looks so good and is such a simplistic OS, and I think once HTC sense gets layed over the top it wont look anythink like that.
Click to expand...
Click to collapse
6chrisp said:
Sorry being a noob, but what makes the A5 better than the processor in the galaxy nexus? I thought the nexus was 1.5Ghz dual-core, surely that's faster?
Click to expand...
Click to collapse
It depends on how optimised the processor is for the phone. The A5 is made specifically for iOS devices only, so it will work far better than the OMAP processor used in the Galaxy Nexus, which has to adapt to a lot of different devices that use it. Also, the Galaxy Nexus has a 1.2 GHz dual-core OMAP processor, not 1.5 GHz.
Sent from my HTC Original Desire using Tapatalk
Isn't the Sensation XL the 16GB-internal, no-microSD-slot version?
If so, I'd pass on that instantly.
The HTC Rezound (a.k.a. Vigor) would be the one, when or if it comes out, if it has the specs it's rumoured to have.
theboymini said:
The HTC Rezound (a.k.a Vigor) would be the one when or if it comes out and had the specs it's saying it might have.
Click to expand...
Click to collapse
True; isn't that like the Sensation XE + more RAM? (1GB rather than 3/4GB)
For me it's either a smaller device (~4 inches) or the Galaxy Nexus. Basically, HTC for me has two big assets: Sense and the shells (casing). Sense is much more polished than GB, but the gap narrows with ICS, especially with some extra apps. While the SGS/Nexus S has a completely ridiculous material on the back, the SGS2 and Galaxy Nexus seem to have made some improvements in this matter.
BTW: Thank God (and the admins) for the ignore feature on this forum...
Lothaen said:
true - isnt that like the senesation xe + more ram? (1gb rather than 3/4GB)
Click to expand...
Click to collapse
It should have the 4.3-inch 720p display (like the Nexus) and the 1GB of RAM. If only they put in 16/32GB of memory with expandable microSD, it would be the one. Also, as YorickRise says, a nice solid case rather than those plastic ones Samsung insists on making so their phones are lighter.
Just got to wait...
itachi1706 said:
It depends on how optimised the processor is to work with the phone. A5 is made specifically for iOS devices only so it will work way better than the OMAP processor used in the Galaxy Nexus as the processor has to adapt to a lot of different device who uses it. Also, Galaxy Nexus has a 1.2 Ghz Dual Core OMAP processor, not 1.5 Ghz.
Sent from my HTC Original Desire using Tapatalk
Click to expand...
Click to collapse
A Cortex A9 dual core is a Cortex A9 dual core. In terms of CPU performance there shouldn't be a great deal of difference between any of them, unless the app uses NEON, which isn't supported on the Tegra 2. And the dual-core Qualcomms perform a bit differently because Snapdragon is A8-compatible but not actually based on the A8 or A9 cores - it's Qualcomm's own design. However, the GPU is different in all these chips, and that's where the difference comes in. OMAP 4, as seen in the Galaxy Nexus and newer Moto devices, uses the old SGX540, which you may know from the original Galaxy S inside Samsung's Hummingbird single cores. It was the best mobile GPU in its day and has been clocked much faster and paired with dual-channel memory in the OMAP 4, giving it similar performance to the Mali-400MP in the Exynos and the Adreno 220 in the Qualcomms, and putting it slightly ahead of the GeForce ULP in the Tegra 2. Not enough, IMHO, for such a high-res screen as that in the Galaxy Nexus. The shiny new SGX543MP2 in the Apple A5 blows all of the competition out of the water at the moment though! However, we will see other SoCs in early 2012 that are competitive from a GPU standpoint.
Sent from my MB860 using Tapatalk
theboymini said:
Should have the 4.3-inch 720p display (like the Nexus) and the 1GB of RAM. If they also put in 16/32GB of storage with an expandable microSD slot it would be the one. Also, as YorickRise says, a nice solid case rather than those plastic ones Samsung insists on making to keep their phones light.
Just got to wait....
Click to expand...
Click to collapse
Those flimsy plastic cases, while not feeling so great in the hand and looking pretty tacky for a high-end device, are actually a lot more impact-resistant than the metal-bodied phones. Look for Galaxy SII drop tests on YouTube if you don't believe me. It mirrors the experience I've had with laptops (i.e. Apple's pro line with the metal cases have lasted a lot less well than the plastic-bodied non-pro models).
Sent from my MB860 using Tapatalk
itachi1706 said:
It depends on how optimised the processor is to work with the phone. The A5 is made specifically for iOS devices, so it will work much better than the OMAP processor used in the Galaxy Nexus, which has to adapt to a lot of different devices. Also, the Galaxy Nexus has a 1.2 GHz dual-core OMAP processor, not 1.5 GHz.
Sent from my HTC Original Desire using Tapatalk
Click to expand...
Click to collapse
Actually, no. A dual-core 1.2 GHz processor contains exactly 2 cores, each capable of exactly 1.2 billion cycles per second - no more and no less - and compared to the Apple A5 in the 4S, clocked at 800 MHz, that is a 50% increase.
The A5 GPU, however, is indeed a tad better.
edit: and I have to completely disagree that Apple can in any way utilise the raw processing power better than other companies. That is just plain wrong. They can and have, however, built the system to use the GPU for transition effects and simple animations, just as Google does in Honeycomb and onwards.
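The clock comparison above is simple enough to check (figures as quoted in the post; this is just the arithmetic, not a performance claim):

```python
# Clock figures as quoted above (clock alone does not determine performance).
galaxy_nexus_mhz = 1200  # OMAP 4460 in the Galaxy Nexus
iphone_4s_mhz = 800      # Apple A5 as clocked in the iPhone 4S

increase = (galaxy_nexus_mhz - iphone_4s_mhz) / iphone_4s_mhz
print(f"{increase:.0%} higher clock")  # prints "50% higher clock"
```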
mortenmhp said:
Actually, no. A dual-core 1.2 GHz processor contains exactly 2 cores, each capable of exactly 1.2 billion cycles per second - no more and no less - and compared to the Apple A5 in the 4S, clocked at 800 MHz, that is a 50% increase.
The A5 GPU, however, is indeed a tad better.
edit: and I have to completely disagree that Apple can in any way utilise the raw processing power better than other companies. That is just plain wrong. They can and have, however, built the system to use the GPU for transition effects and simple animations, just as Google does in Honeycomb and onwards.
Click to expand...
Click to collapse
iOS 5 is clearly better optimised than Android 2.3 in most benchmarks, especially in browser-related tests like JavaScript benches. However, we'll see how 4.0 handles it, as 3.0 on tablets is a lot closer to iOS when benchmarked on similar hardware. And not all CPUs perform the same at the same clock speed. Do you really think a dual-core Atom (simple, in-order core with limited cache and bandwidth and fewer execution units) performs the same as an i3 (complex, out-of-order core with loads of cache, loads of bandwidth and lots of execution units) at the same clock, for example? In fact, the reason I chose this particular comparison is that it's an extreme case between CPUs sharing the x86 instruction set; the i3 would be more than twice as fast in most cases.
CPU performance depends on a lot of things: for example the number of execution units capable of a given operation, pipeline length, cache optimisations, memory bandwidth, bus speeds, the efficiency of the instruction rescheduler (for out-of-order CPUs) and a number of other factors. Even CPUs with the same cores (like ARM's A9, for example) can perform differently - some (like TI's OMAP) have dual-channel memory whereas Tegra 2, for example, is constrained to a single channel, although this is much more likely to affect GPUs (which are also integrated and share memory bandwidth with the CPU) than CPUs with current cores. The CPU cores in Snapdragon S3, particularly, perform quite differently (a little worse in most cases) than other current-gen dual-core ARM chips due to their use of Qualcomm's Scorpion CPU core (an ARMv7 design compatible with, but not identical to, an A8 - a single Scorpion is faster than a single A8 due to partial out-of-order support, but the more complete out-of-order support and shorter instruction pipeline mean an A9 will perform better per core at the same clock than Snapdragon).
And that's before we even mention instruction set extensions like NEON and SSE - when running code which is optimised for and can take advantage of these (which tends to be media-related software like video encoding) you could end up with orders of magnitude of difference in performance. The implementation of Sandy Bridge's AVX extensions allows it to double performance at the same clock in Linpack benchmarks versus the previous-generation 'Nehalem' chips, for example.
Oh, and SGX543MP2 isn't just a tad faster than anything we have in Android hardware at the moment, it's A LOT faster, especially given that A5 uses dual channel memory and everything we have bar the TI OMAP 4 with its aging (though fast-clocked) SGX540 is single channel. Also bear in mind that the iPhone 4/s GPU is dealing with a 960x640=614,400 pixel display whereas many high-end Android devices (GSII for example) are still only packing 800x480 displays with 384,000 pixels and the GPU has to do a lot more work to render 60% more pixels! - be careful when comparing benchmarks!
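The pixel-count arithmetic above is easy to verify (display names and resolutions are the ones quoted in the post; the script is purely illustrative):

```python
# Rough per-frame pixel workload for the displays discussed above.
displays = {
    "iPhone 4/4S (Retina)": (960, 640),
    "Galaxy S II (WVGA)": (800, 480),
    "Galaxy Nexus (720p)": (1280, 720),
}

pixels = {name: w * h for name, (w, h) in displays.items()}
for name, px in pixels.items():
    print(f"{name}: {px:,} pixels")

# The Retina panel pushes 60% more pixels than WVGA, as noted above:
ratio = pixels["iPhone 4/4S (Retina)"] / pixels["Galaxy S II (WVGA)"]
print(f"Retina vs WVGA: {ratio:.2f}x")  # prints "Retina vs WVGA: 1.60x"
```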

[Q] Single Core GPU

All my friends are big-time Apple fans. I respect their choice and I do find that there are some things which are great in Apple products. So whenever I buy a smartphone I try to make sure that it is as powerful as the iPhone, at least hardware-wise. The iPhone 4S has a dual-core GPU whereas the Galaxy Nexus has only a single core??!!!
I was actually thinking of grabbing the Motorola Razr, but after realising that it has a somewhat slower processor and also a single-core GPU I thought I would look at the Galaxy Nexus instead. Unfortunately it seems like the G.Nexus has a single-core GPU too.
Does anyone know how much it matters, compared to a dual-core GPU? Especially when you are using an HDMI cable to watch something on your TV.
Thanks
PS: The only phones on the market right now that could go head to head with the iPhone 4S are the LG Optimus HD and the HTC Rezound, I think.
Bikram said:
... Nexus has only a single core??!!!
I was actually thinking of grabbing the Motorola Razr, but after realising that it has a somewhat slower processor and also a single-core GPU I thought I would look at the Galaxy Nexus instead. Unfortunately it seems like the G.Nexus has a single-core GPU too.
Does anyone know how much it matters, compared to a dual-core GPU? Especially when you are using an HDMI cable to watch something on your TV.
...
Click to expand...
Click to collapse
It's only really going to matter with games. The SGX540 in the Galaxy Nexus should be able to handle hdmi output for video with no problem.
Sent from my Nexus S using Tapatalk
[hfm] said:
It's only really going to matter with games. The SGX540 in the Galaxy Nexus should be able to handle hdmi output for video with no problem.
Sent from my Nexus S using Tapatalk
Click to expand...
Click to collapse
Why does the GPU need to handle video when the OMAP 4460 has a dedicated IVA3 processor to handle video? And to the original poster: the Galaxy S2 has a quad-core GPU. Tegra 2 has 8 cores. Tegra 3 will have 12. It seems your phone preference is purely superficial and that you would be better off getting an iPhone, since you just want to be the 'my phone has more horses under the hood than yours' type of person.
I invented cyberspace. You're trespassing.
More cores doesn't necessarily mean better... I know it's a different architecture, but a quad-core i7 outperforms a hexa-core AMD Bulldozer....
Cores are turning into the new megapixel
slowz3r said:
Cores are turning into the new megapixel
Click to expand...
Click to collapse
Exactly. I can't believe how many of my friends (iPoop users) have asked me "is that phone dual core?" because of the iPoop 4S boasting a better processor - then they say "hmm, that must make a phone powerful!"
I do have some things against Apple, but even so, people always wanted megapixels and now they want cores. :\
Sent from my Senseless Doubleshot using xda premium
Exactly. Cores don't double performance when you move up to 2, or quadruple it when you move to 4. A car with 300hp won't necessarily outperform a car with 200hp either.
I invented cyberspace. You're trespassing.
pukemon said:
And to the original poster: the Galaxy S2 has a quad-core GPU. Tegra 2 has 8 cores. Tegra 3 will have 12.
Click to expand...
Click to collapse
Not in the sense that the A5 is dual-core. The Mali-400MP has four pixel shaders (which is what they call quad-core) and one vertex shader. Tegra 2 has four pixel shaders and four vertex shaders, which nVidia totals to 8. Tegra 3 has 8 pixel shaders and 4 vertex shaders, for a total of 12. They aren't cores at all - that's just BS that nVidia started a while ago to take a cheap shot at Intel.
On the other hand, the SGX543MP2 effectively has two complete GPUs - each with four pixel shaders and four vertex shaders. Hardware wise, it's pretty much equivalent to taking two of the GPUs used in the Nexus and sticking them together. Albeit at a lower clock rate, I imagine.
Not that it really makes that much difference for mobile games - art style can go a long way towards making a game look amazing on low-power hardware. The iPhone 4 has/had some pretty amazing looking games, and you'd struggle to find a higher-end Android these days whose GPU doesn't absolutely smash the iPhone 4.
Sjael said:
Not in the sense that the A5 is dual-core. The Mali-400MP has four pixel shaders (which is what they call quad-core) and one vertex shader. Tegra 2 has four pixel shaders and four vertex shaders, which nVidia totals to 8. Tegra 3 has 8 pixel shaders and 4 vertex shaders, for a total of 12. They aren't cores at all - that's just BS that nVidia started a while ago to take a cheap shot at Intel.
On the other hand, the SGX543MP2 effectively has two complete GPUs - each with four pixel shaders and four vertex shaders. Hardware wise, it's pretty much equivalent to taking two of the GPUs used in the Nexus and sticking them together. Albeit at a lower clock rate, I imagine.
Not that it really makes that much difference for mobile games - art style can go a long way towards making a game look amazing on low-power hardware. The iPhone 4 has/had some pretty amazing looking games, and you'd struggle to find a higher-end Android these days whose GPU doesn't absolutely smash the iPhone 4.
Click to expand...
Click to collapse
I'm well aware of this. The OP probably doesn't know or care. Hell, if you get a desktop-class graphics card you're looking at upwards of 300 "cores" and more. It boils down to different architectures and marketing, of course. The 540 is a capable mobile GPU; nobody is quite sure how it handles higher resolutions though. Lots of CPUs/GPUs have quoted specs saying they can handle this and that, but when you push the theoretical limits performance goes south quickly. And the 543MP2 is overkill currently - probably a major factor in why the iPhone 4S is getting ****ty battery life. Who wants to brag about how ****ty the battery life of their phone is, though? These days 4 hours of display time while using your phone all day is considered good.
I invented cyberspace. You're trespassing.
The Galaxy Nexus is dual core.....
ssconceptz said:
The Galaxy Nexus is dual core.....
Click to expand...
Click to collapse
Well, whatever you do, don't read what we're actually talking about... not the CPU... the GPU.
Sent from my myTouch_4G_Slide using xda premium
ssconceptz said:
The Galaxy Nexus is dual core.....
Click to expand...
Click to collapse
Dual-core CPU. Single-core GPU.
I invented cyberspace. You're trespassing.
It's not really correct to call the GPU "single core". A GPU is a parallel processing unit, what matters is the number of pixel pipelines, the number of shader units, the number of ROPs, etc. In fact, performance should scale better using a GPU that has 8 shader units on a single physical core than a GPU that has 4 shader units each on two physical cores. The only good reason to spread GPU processing units across multiple physical cores is ease of manufacturing. Usually you would expect the next generation of GPUs like this to merge the physical cores into a single physical core with the same total number of compute units.
---------- Post added at 04:27 PM ---------- Previous post was at 04:25 PM ----------
slowz3r said:
More cores doesn't necessarily mean better... I know it's a different architecture, but a quad-core i7 outperforms a hexa-core AMD Bulldozer....
Cores are turning into the new megapixel
Click to expand...
Click to collapse
CoNsPiRiSiZe said:
Exactly. I can't believe how many of my friends (iPoop users) have asked me "is that phone dual core?" because of the iPoop 4S boasting a better processor - then they say "hmm, that must make a phone powerful!"
I do have some things against Apple, but even so, people always wanted megapixels and now they want cores. :\
Sent from my Senseless Doubleshot using xda premium
Click to expand...
Click to collapse
pukemon said:
Exactly. Cores don't double performance when you move up to 2, or quadruple it when you move to 4. A car with 300hp won't necessarily outperform a car with 200hp either.
I invented cyberspace. You're trespassing.
Click to expand...
Click to collapse
We are talking about GPUs here. GPUs solve what is called an "embarrassingly parallel" problem. Here, twice the number of cores usually does mean twice the performance.
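To put numbers on "embarrassingly parallel", here is a quick Amdahl's-law sketch. The parallel fractions below are illustrative assumptions, not measurements of any real chip:

```python
def amdahl_speedup(cores: int, parallel_fraction: float) -> float:
    """Ideal speedup on `cores` cores when `parallel_fraction`
    of the work can run in parallel (Amdahl's law)."""
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction / cores)

# Pixel shading is close to embarrassingly parallel (~99.9% parallel here),
# so doubling shader cores nearly doubles throughput:
print(amdahl_speedup(2, 0.999))   # ~2x, near-ideal scaling
# Typical app code with a 25% serial portion scales far worse:
print(amdahl_speedup(2, 0.75))    # 1.6x
print(amdahl_speedup(4, 0.75))    # ~2.29x, nowhere near 4x
```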
There are no mobile games in existence that really make me care about my phone's GPU, and I highly doubt there will be in the coming year or two. I doubt my phone will ever draw me away from my PC or gaming platform at home, and when I'm away the most hardware-intensive games I play are Plants vs. Zombies, Fruit Ninja, or Words With Friends - and that's only after I've looked all over the internets for something else to do other than game.
Sent from my Nexus S using XDA App
Question: how would you expect the higher-clocked PowerVR SGX540 to perform vs. the Galaxy S2?
Well, it doesn't seem to matter. The Galaxy S and the Nexus have the same GPU at a lower clock, and I haven't seen a game they can't handle; then add 1GB of RAM and a better CPU and you're all good.
Sent from my SGH-T959 using XDA App
If you look at Droid Gamer, you'll find lots of crazy high-end games are on the horizon. Most take advantage of the Tegras, which currently are beasts. The G2x by LG is such a beast - a Tegra 2 powerhouse that easily handles any game made today.
Tegra 3s are due out around the first part of next year, but probably nothing subsidized in the US till next Christmas or early 2013.
Even still, games are made to run well on the popular hardware of today. Adding more power is currently overkill and only adds bonus eye candy. I expect upcoming games to be playable on any high-end phone, Nexus included.
Plus iPhones are late to the dual-core game. While the GPU is nice, no iPhone user will know about anything past the words they are regurgitating.
I am concerned about going from a powerful G2x to a GN. I'm getting mine through work to save on a personal cell bill, but I hope everything is at least moving in the right direction.
I'd trade my 8MP camera for a 5MP one with an instant shutter, because I have a kid I snap photos of all the time. Games are a luxury, and I do have a Tegra 2 Transformer that I do most of my gaming on (outside passive games).
G2x - 2.3.7 CM7
Transformer - 3.2 Revolver OC/UV
This GPU has 4 pipes, not 1.
Diagram of the multi-pipe layout: [image attachment]
Does the iPhone 4S have a dual-core GPU?? Really?
I thought it only had a dual-core CPU....
iPOOP!
The problem with all the core nonsense for GPUs is the lack of a unified architecture. What we're comparing is similar to what we had before the GeForce 8 and Radeon HD 3000 series, in that there are individual pixel and vertex shaders. It's complete BS to call either of them "cores" if we use the same word for CPU cores. By that measure, my ATI 5850 has 1440 "cores", which makes any quad-core processor seem crap, no?
As pointed out, GPUs are still in their infancy in the mobile world - hence the lack of a unified architecture - *but* ARM is bringing one in their next-gen Mali GPU, so we should finally start to move towards a more reasonable comparison of GPUs soon. For now, it's best to ignore what "cores" a GPU has and look at the number of pixel shaders, vertex shaders and ROPs (render output units). Essentially, PowerVR is aiming for an even balance between pixel and vertex shaders (hence why the MP2 has 8 of each), whilst Nvidia thinks the major limiting factor is pixel shading rather than complex geometry via vertex shaders, which is why they push for an 8:4 / 12:4 ratio. Whether that's a smart idea or whether it will limit Tegra vs PowerVR is hard to tell yet.
That said, expect a massive change when we jump to the unified architectures of Tegra 4 / the new Mali / the new Adreno. Until then, take the "cores" with a grain of salt. The only legitimate use of the word core is with the PowerVR SGX543MP2/MP4, where the 2 and the 4 represent 2 and 4 complete SGX543 GPUs stuck together, though each of these is clocked lower than an individual core would be, presumably for heat/power reasons. And also remember, just like in the desktop world, drivers make a world of difference to performance in games. It's pretty hard to see who's better at that yet either, but Nvidia might hold a better hand with their experience from the desktop/laptop market. This is probably the best time if you like observing massive evolutions in graphics tech, but it's a crappy time for consumers since the next best thing comes very, very quickly.
Edit: For anyone who's interested, Anand had a great write up comparing the architectures here. Some quotes to justify:
The Mali-400 isn't a unified shader architecture, it has discrete execution hardware for vertex and fragment (pixel) processing. ARM calls the Mali-400 a multicore GPU with configurations available with 1 - 4 cores. When ARM refers to a core however it's talking about a fragment (pixel shader) processor, not an entire GPU core. This is somewhat similar to NVIDIA's approach with Tegra 2, although NVIDIA counts each vertex and fragment processor as an individual core.
Click to expand...
Click to collapse
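The shader counts discussed above can be laid out in a few lines to show how arbitrary the "core" totals are. The figures are the ones quoted in this thread, not official spec sheets:

```python
# (pixel shaders, vertex shaders) per GPU, as quoted in this thread.
gpus = {
    "Mali-400MP4": (4, 1),  # ARM counts only the 4 fragment processors as "cores"
    "Tegra 2 ULP": (4, 4),  # nVidia counts all 8 units as "cores"
    "Tegra 3 ULP": (8, 4),  # marketed as "12-core"
    "SGX543MP2":   (8, 8),  # two complete SGX543 cores, 4+4 shaders each
}

for name, (ps, vs) in gpus.items():
    print(f"{name}: {ps} pixel + {vs} vertex = {ps + vs} shader units")
```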
Ask the iPhone pukes if they have something that matters: good call quality and 4G.
Sent from my SCH-I400 using Tapatalk

Discussing the performance of the Tegra 3 SoC

The iPad 3's GPU has about twice the performance of the One X, but in OFFSCREEN 720p mode.
Check it out: http://glbenchmark.com/result.jsp?b...ersion=all&certified_only=1&brand=all&gpu=all
Would that mean there are no worries about game performance, since we have a much lower resolution?
What I was thinking is that game makers who optimise their games for the new iPad wouldn't run them at native resolution but at something around 720p, and that would mean our One X is simply incapable of running iPad-optimised games, with less than half the performance of the iPad.
What do you think?
eeeeeee said:
Check it out, http://glbenchmark.com/result.jsp
Would that mean there are no worries about game performance, since we have a much lower resolution?
What I was thinking is that game makers who optimise their games for the new iPad wouldn't run them at native resolution but at something around 720p, and that would mean our One X is simply incapable of running iPad-optimised games, with less than half the performance of the iPad.
What do you think?
Click to expand...
Click to collapse
so why not post this in the mega thread mate
hamdir said:
so why not post this in the mega thread mate
Click to expand...
Click to collapse
The mega thread is already pointless, as there is a forum for the One X,
and some subjects there are being neglected or not getting as much attention as they deserve.
http://glbenchmark.com/phonedetails.jsp?D=Apple+iPad+3&benchmark=glpro21
720p offscreen
Egypt iPad3 140.9
Egypt iPad2 88.8
Egypt Prime 68
Egypt One X 64
Egypt One S 50
Pro iPad3 252.1
Pro iPad2 148.8
Pro Prime 81
Pro One X 82
Pro One S 76
Standard (native resolution)
Egypt iPad3 59.9 @2048x1536
Egypt iPad2 59.6 @1024x768
Egypt Prime 46.8 @1280x800
Egypt One X 51 @1280x720
Egypt One S 57 @540x960
Pro iPad3 60 @2048x1536
Pro iPad2 60 @1024x768
Pro Prime 54 @1280x800
Pro One X 54 @1280x720
Pro One S 60 @540x960
So we are talking between 2x and 3x the T3, not 4x like Apple claimed.
iPad3 CPU same as iPad2
http://www.engadget.com/2012/03/13/new-ipad-gets-benchmarked-1gb-ram-confirmed-no-boost-in-cpu-sp/
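Dividing the offscreen figures above gives the actual iPad 3 : One X ratios (pure arithmetic on the quoted numbers, nothing more):

```python
# Offscreen 720p fps figures quoted above (GLBenchmark Egypt / Pro).
egypt = {"iPad3": 140.9, "iPad2": 88.8, "Prime": 68, "One X": 64, "One S": 50}
pro   = {"iPad3": 252.1, "iPad2": 148.8, "Prime": 81, "One X": 82, "One S": 76}

for name, scores in (("Egypt", egypt), ("Pro", pro)):
    ratio = scores["iPad3"] / scores["One X"]
    print(f"{name}: iPad3 is {ratio:.1f}x the One X")
# Egypt works out to ~2.2x and Pro to ~3.1x - between 2x and 3x,
# not the 4x Apple claimed.
```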
hamdir said:
http://glbenchmark.com/phonedetails.jsp?D=Apple+iPad+3&benchmark=glpro21
720p offscreen
Egypt iPad3 140.9
Egypt iPad2 88.8
Egypt Prime 68
Egypt One X 64
Egypt One S 50
Pro iPad3 252.1
Pro iPad2 148.8
Pro Prime 81
Pro One X 82
Pro One S 76
so we are talking between 2x and 3x the T3 and not 4x like Apple claimed
Click to expand...
Click to collapse
Thanks - indeed a little more than 2x on some benchmarks, and up to about 3x.
Now the big question here is whether the iPad will be upscaling games or rendering at native resolution.
Another question is how upscaling (1:4) looks versus native low resolution (1:1).
These details will really affect my decision whether to get the One X, since I'm not planning to buy an outclassed GPU - upscaled games will be able to run 2x as fast on the iPad 3, and the One X is already incapable of keeping up before being released.
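The upscaling ratios being debated are easy to pin down (resolutions from the posts above; the "render at iPad 2 resolution" case is the hypothetical under discussion, not a confirmed rendering mode):

```python
def pixel_ratio(a, b):
    """How many times more pixels resolution a has than resolution b."""
    return (a[0] * a[1]) / (b[0] * b[1])

ipad3 = (2048, 1536)
ipad2 = (1024, 768)
onex  = (1280, 720)

# iPad 3 native is exactly 4x the iPad 2's pixel count (2x per axis), so
# rendering at iPad 2 resolution and upscaling is a 1:4 pixel upscale:
print(pixel_ratio(ipad3, ipad2))  # prints 4.0
# Versus the One X's 720p panel, the iPad 3 pushes ~3.4x the pixels:
print(round(pixel_ratio(ipad3, onex), 2))  # prints 3.41
```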
eeeeeee said:
Thanks - indeed a little more than 2x on some benchmarks, and up to about 3x.
Now the big question here is whether the iPad will be upscaling games or rendering at native resolution.
Another question is how upscaling (1:4) looks versus native low resolution (1:1).
These details will really affect my decision whether to get the One X, since I'm not planning to buy an outclassed GPU - upscaled games will be able to run 2x as fast on the iPad 3, and the One X is already incapable of keeping up before being released.
Click to expand...
Click to collapse
Most game developers on Twitter said the iPad 3 GPU is not enough to feed the massive pixel count, so yes, anticipate a lot of upscaling; Anandtech is making the same assumption.
You keep forgetting that most Tegra Zone T3-enhanced games use the quad cores for graphics... so the T3's standalone GPU will not keep up, but the quad cores will make up for it.
It's really not an apples-to-apples comparison; the T3 is designed as a complete mobile graphics solution. The Shadowgun THD and Glowball demos are real-world examples.
hamdir said:
Most game developers on Twitter said the GPU will not be enough to feed the massive pixel count, so yes, anticipate a lot of upscaling; Anandtech is making the same assumption.
You keep forgetting that most Tegra Zone T3-enhanced games use the quad cores for graphics... so the T3's standalone GPU will not keep up, but the quad cores will make up for it.
It's really not an apples-to-apples comparison; the T3 is designed as a complete mobile graphics solution. The Shadowgun THD and Glowball demos are real-world examples.
Click to expand...
Click to collapse
But to compete with the new iPad one must optimise the game for the T3 at the code level, which is never going to happen outside of Nvidia's Tegra Zone.
eeeeeee said:
But to compete with the new iPad one must optimise the game for the T3 at the code level, which is never going to happen outside of Nvidia's Tegra Zone.
Click to expand...
Click to collapse
Why? Most games are optimised for their platforms. I didn't even think the DHD with its Adreno 205 could run Shadowgun, and it did once it was optimised for it.
Unity's developer documents reveal they optimise for every main GPU on the market.
And regarding Tegra Zone: it's one of the best reasons to buy a T3 - you have a powerhouse like Nvidia pushing devs to optimise for it.
Same case with the iPad: devs have to optimise for the tile-based GPU. Believe me, keeping all those pixels inside a tile-based buffer will be a major headache for iPad 3 games.
I really doubt the iPhone 5 will carry the 543MP4; if they want to keep parity with the iPad 3 they will simply bump the iPhone 4S CPU from 800 MHz to 1 GHz and keep the old GPU.
---------- Post added at 12:41 PM ---------- Previous post was at 12:39 PM ----------
If you run GPU benchmarks between the Xbox 360 and the PS3, the Xbox will wipe the floor with the PS3's GPU.
But in the real world there are many PS3 games the Xbox 360 cannot even match, while 90% of the time they are even.
The PS3 has the cores to assist its inferior GPU, and the Xbox 360 has the GPU assisting its inferior CPU - a very similar scenario here, mate.
In desktop PC development they are already moving towards unified CPU/GPU cores.
Finally
We already don't have a games match between Android and Apple, so what's your best choice for GPU if you skip the T3? None, my friend, there is none. The T3 is the best SoC for graphics on Android right now.
hamdir said:
Why? Most games are optimised for their platforms. I didn't even think the DHD with its Adreno 205 could run Shadowgun, and it did once it was optimised for it.
Unity's developer documents reveal they optimise for every main GPU on the market.
And regarding Tegra Zone: it's one of the best reasons to buy a T3 - you have a powerhouse like Nvidia pushing devs to optimise for it.
Same case with the iPad: devs have to optimise for the tile-based GPU. Believe me, keeping all those pixels inside a tile-based buffer will be a major headache for iPad 3 games.
I really doubt the iPhone 5 will carry the 543MP4; if they want to keep parity with the iPad 3 they will simply bump the iPhone 4S CPU from 800 MHz to 1 GHz and keep the old GPU.
Click to expand...
Click to collapse
Offloading graphics to CPU cores is very unnatural, and requires heavy modifications to your code.
Again, I'm not a graphics expert, but it really depends on one's coding style: if you code your game well enough it will not be hard to port it over to Tegra 3; however, if there's no sign of threads or any separation of processes in your game, you will literally have to re-develop the game for the Tegra 3.
Now, we all know that developers favour iOS over Android in almost every case, especially for gaming, so in my opinion we can expect really bad performance within less than a year of owning the One X.
By the way, the fact that the Tegra 3 is the best SoC out there for Android kind of depresses me.
I wanted to see Android smartphone manufacturers such as HTC and Samsung adopt the PowerVR solution, since it's sadly much, much better than any other mobile GPU on the market.
Yes, but I am a graphics expert.
We don't have parity in iOS games vs Android; thanks to Tegra Zone we have a lot more games.
In fact the Tegra 3 already runs most games a lot better than the iPad 2, at a much higher resolution.
I don't get your point.
The competition is T3 vs iPad 2, not iPad 3, since the extra cores will simply serve to feed more pixels (even if slightly upscaled, the massive pixel count is beyond 4x); the T3 has 1.5x to 2x competition at most.
If you have time, read the Unity development document to get a better idea.
You still didn't answer me: what is your alternative? Buying an iOS device?
hamdir said:
Yes, but I am a graphics expert.
We don't have parity in iOS games vs Android.
In fact the Tegra 3 already runs most games a lot better than the iPad 2, at a much higher resolution.
I don't get your point.
The competition is T3 vs iPad 2, not iPad 3, since the extra cores will simply serve to feed more pixels; the T3 has 1.5x to 2x competition at most.
If you have time, read the Unity development document to understand.
You still didn't answer me: what is your alternative? Buying an iOS device?
Click to expand...
Click to collapse
Just to make sure we understand each other: I'm not even thinking of buying an iOS device - the OS sucks - but hell, its devices have very good hardware, and developers optimise things for them first.
You are right that more CPU power CAN be better, and therefore the Tegra 3 might perform better. However, I'm honestly asking whether we will have developer support for every app and game that requires performance, or whether our SoC gets neglected,
filled with choppy, stuttering games using only two cores at half the power and pushing everything to the GPU.
Just look at Adobe Photoshop Touch: the Tegra 3 manages something like 5 fps zooming and panning while the iPad 2 does 60 fps.
PowerVR's tiling has its performance downsides too.
An upcoming Android phone with an Intel SoC + PowerVR SGX544 is coming, so maybe you should consider that one.
As for me, I'm rushing to Nvidia and never coming back; it's thanks to them that we are seeing a lot more games on Android.
hamdir said:
PowerVR's tiling has its performance downsides too.
An upcoming Android phone with an Intel SoC + PowerVR SGX544 is coming, so maybe you should consider that one.
As for me, I'm rushing to Nvidia and never coming back.
Click to expand...
Click to collapse
I don't think I will go for x86, but have a look at my edited previous post about Adobe Photoshop as an app that isn't optimised for Tegra 3, where nobody really cares...
Regarding Photoshop Touch, I hope it's a development problem and not the limited memory bandwidth on the T3; we should ask Nvidia about this.
hamdir said:
Regarding Photoshop Touch, I hope it's a development problem and not the limited memory bandwidth on the T3; we should ask Nvidia about this.
Click to expand...
Click to collapse
Either one proves my point: either the Tegra 3 is useless without optimisations, or the Tegra 3 is simply not good enough. Shame on Nvidia, what can I say.
eeeeeee said:
Either one proves my point: either the Tegra 3 is useless without optimisations, or the Tegra 3 is simply not good enough. Shame on Nvidia, what can I say.
Click to expand...
Click to collapse
Mate, optimising for certain hardware is not wrong! In fact, all hardware requires this; how else would you push things forward? Otherwise let's stick PowerVR and ARM in every device and call it quits so devs don't have to "optimize".
hamdir said:
Mate, optimising for certain hardware is not wrong! In fact, all hardware requires this; how else would you push things forward? Otherwise let's stick PowerVR and ARM in every device and call it quits so devs don't have to "optimize".
Click to expand...
Click to collapse
I accidentally pressed the thanks button =] although, thinking it over, you do deserve one after your useful posts.
All hardware requires optimizations, but history has proven that when software has to be heavily optimized for a device to get good performance, developers might neglect it and just accept worse performance. We are witnessing exactly that when comparing Adobe Photoshop Touch across platforms, and I can give many other examples.
OK, regarding up-scaling:
for sure iPad 3 devs will use up-scaling, but not by 4x; anything higher than 2x up-scaling will show its ugly face.
In fact, the only worry I have about the T3 in general is not the GPU but the limited memory bandwidth; most tablets are around 720p, though, and the T3 seems to keep up well. It also handles 1080p H.264 very well.
Heavy games like Infinity Blade might suffer from the bandwidth, though.
But consider this: I always wanted to move to Tegra. Why? Because of much better game support; you can't deny it's TegraZone that started pushing high-end 3D games to Android.
Regarding optimisations, everyone said the same about the PS3, but with Sony backing it, it proved them wrong. I find it really wrong that we expect to pamper developers just because they are now comfortable with PowerVR thanks to iOS.
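To put rough numbers on the up-scaling and pixel-count arguments in this thread, here is a small illustrative Python sketch. The panel resolutions are the devices' well-known native sizes; the 2x factor is per dimension, so it cuts the shaded pixel count to a quarter:

```python
# Native panel resolutions (width, height) of the devices discussed.
resolutions = {
    "qHD (One S)": (960, 540),
    "720p (One X)": (1280, 720),
    "iPad 3": (2048, 1536),
}

def pixels(res):
    w, h = res
    return w * h

base = pixels(resolutions["720p (One X)"])
for name, res in resolutions.items():
    print(f"{name}: {pixels(res):,} px ({pixels(res) / base:.2f}x vs 720p)")

# Rendering at half resolution per dimension and up-scaling 2x means
# shading only a quarter of the native pixel count.
w, h = resolutions["iPad 3"]
print(f"iPad 3 with 2x up-scale shades {(w // 2) * (h // 2):,} px "
      f"instead of {w * h:,}")
```

This is also why the 720p One X pushes roughly 1.8x the pixels of the qHD One S in onscreen tests, while offscreen benchmarks render both at the same resolution.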
hamdir said:
OK, regarding up-scaling:
for sure iPad 3 devs will use up-scaling, but not by 4x; anything higher than 2x up-scaling will show its ugly face.
In fact, the only worry I have about the T3 in general is not the GPU but the limited memory bandwidth; most tablets are around 720p, though, and the T3 seems to keep up well.
Heavy games like Infinity Blade might suffer from the bandwidth, though.
Click to expand...
Click to collapse
Good subject. Do you have any details about games on the Transformer Prime, or even apps that already suffer from it?
Regarding the PS3 analogy, I honestly think it's irrelevant, simply because the PS3 is a permanent target platform, which I hope Tegra 3 remains for a long period of time.
Was Nvidia wrong not to shoot for the moon with its GPU and memory bandwidth? Yes, but that's why they were the first to ship a quad-core mobile SoC while the Apple A6, ARM Cortex-A15 and quad-core S4 are far away.
Oh, and a very important point:
according to Unity, Nvidia Tegra has the best development tools and performance analyzers; Qualcomm comes second, while the iPad 2 tools are lacking.
All we know about the Prime is that all THD games run a lot better than on the iPad 2.
There is a reason why so many say "**** benchmarks".

CPU benchmark: HTC One X vs Galaxy Note vs Galaxy Nexus vs Galaxy S II

Guys, take a look at the chart. The HTC One X is a monster; all the small problems will be solved with the next OTA update.
•HTC One X: Nvidia Tegra 3, 1.5 GHz quad-core
•Galaxy Note: Samsung Exynos 4210, 1.4 GHz dual-core
•Galaxy Nexus: Texas Instruments OMAP 4460, 1.2 GHz dual-core
•Galaxy S II: Qualcomm Snapdragon APQ8060, 1.5 GHz dual-core
More info: http://www.htc-hub.com/htc/actualites/benchmark-les-samsung-galaxy-impuissants-face-au-htc-one-x/
I see HTC-Hub are back in HTC love; they quickly got over that One S problem, didn't they?
Apparently this bench was supplied by Nvidia itself, hence I guess we can call it hand-picked.
I'll post it in my mega thread for completeness, if you don't mind.
Are you high?
supendal said:
Are you high?
Click to expand...
Click to collapse
Who is high? Because I said hand-picked? They only put it up against the Galaxy series.
I will post a new benchmark of the Sony S and 3 other phones soon. The HTC One X is very powerful: at minimum 40 percent more powerful than the Sony S.
That's one of my favorite things about the One X: its speed. It simply blows away the competition no matter which benchmark software you use. I've had mine a week now and I love it; it is nice to see HTC in the lead again.
It does look awesome with the stunning result, but don't you think the other 3 devices aren't in the same league as the HOX?
TBH it's not fair running benchmarks against other dual-core devices. We have a quad-core, one of its kind. Once other manufacturers release quad-cores, then we can start the real benchmark tests. Till then we don't need some stupid benchmark to tell us we have the fastest device in the world.
JeremyGuan said:
It does look awesome with the stunning result, but don't you think the other 3 devices aren't in the same league as the HOX?
Click to expand...
Click to collapse
That's what I thought when I first saw it.
SamStone said:
That's one of my favorite things about the One X: its speed. It simply blows away the competition no matter which benchmark software you use. I've had mine a week now and I love it; it is nice to see HTC in the lead again.
Click to expand...
Click to collapse
the HTC One S blows away the HTC One X, mate
jonneymendoza said:
the HTC One S blows away the HTC One X, mate
Click to expand...
Click to collapse
THIS IS A TOTALLY IGNORANT COMMENT.
We have written, tested, taken screenshots of and posted links to a billion tests, benchmarks, articles and scientific facts so far.
NO, THE DUAL-CORE SNAPDRAGON S4 WITH ADRENO 225 DOES NOT AND WILL NOT BLOW AWAY THE ONE X!!! IT CANNOT EVEN MATCH THE ONE X AT 720P.
Do some proper research before you speak!
Check the CPU scores in here and shut up! And again, the One X is pushing nearly 1.8x the pixel count! http://www.slashgear.com/htc-one-x-vs-htc-one-s-benchmarking-war-03221385/
The point is the dual core S4 is extremely close in performance to the quad core T3, while being much more power efficient. I'll take more power per core any day
Sent from my PG86100 using XDA Premium App
hamdir said:
THIS IS A TOTALLY IGNORANT COMMENT.
We have written, tested, taken screenshots of and posted links to a billion tests, benchmarks, articles and scientific facts so far.
NO, THE DUAL-CORE SNAPDRAGON S4 WITH ADRENO 225 DOES NOT AND WILL NOT BLOW AWAY THE ONE X!!! IT CANNOT EVEN MATCH THE ONE X AT 720P.
Do some proper research before you speak!
Check the CPU scores in here and shut up! And again, the One X is pushing nearly 1.8x the pixel count! http://www.slashgear.com/htc-one-x-vs-htc-one-s-benchmarking-war-03221385/
Click to expand...
Click to collapse
I think you need to do some more research. Yeah, the One S is running fewer pixels, but in general Krait is faster. The One S is also running the slower Krait SoC, while the One XL, the AT&T and Sprint version, will be running the MSM8960.
http://www.anandtech.com/show/5563/qualcomms-snapdragon-s4-krait-vs-nvidias-tegra-3
You need to take into account that barely anything uses 4 cores, and it's more of a marketing ploy by Nvidia. You're looking at four Cortex-A9 cores at 40/45 nm vs two 28 nm Krait cores, which are a brand-new architecture. Krait is more in line with (though it will be slower than) the upcoming Cortex-A15, which Tegra 4 is supposed to run and which TI's OMAP 5 will be using.
On the other hand, Krait still wiped the floor with Tegra in Linpack Multi-threaded.
Here are One XL benchmarks for support:
http://an.droid-life.com/2012/03/26/att-htc-one-xl-benchmarked-blowing-away-other-phones-already/
In real-world usage, Krait will be faster with its newer-generation cores vs Tegra 3's older generation. Barely anything will be optimized for 4 cores besides TegraZone games, most likely, but even then... Krait cores will hold their ground.
The one benchmark Tegra 3 destroys is AnTuTu, because it actually uses all 4 cores, but that isn't a good representation of real-world performance.
GPU-wise, they're about on par... but this is a CPU thread. The Krait Pro SoC will pack the upgraded Adreno GPU.
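The "2 newer cores vs 4 older cores" trade-off argued throughout this thread can be sketched with Amdahl-style arithmetic. The 1.5x per-core speed advantage below is a made-up illustrative number, not a measured Krait-vs-A9 figure:

```python
def speedup(cores, parallel_fraction):
    # Amdahl's law: the serial part of a workload gains nothing
    # from extra cores.
    serial = 1 - parallel_fraction
    return 1 / (serial + parallel_fraction / cores)

def effective_throughput(cores, per_core_speed, parallel_fraction):
    # Relative throughput = single-core speed * Amdahl speedup.
    return per_core_speed * speedup(cores, parallel_fraction)

# Hypothetical numbers: 2 cores that are 1.5x faster per core
# vs 4 cores at baseline speed.
for p in (0.3, 0.7, 0.95):
    two_fast = effective_throughput(2, 1.5, p)
    four_slow = effective_throughput(4, 1.0, p)
    winner = "2 fast" if two_fast > four_slow else "4 slow"
    print(f"parallel fraction {p:.0%}: 2-fast={two_fast:.2f} "
          f"4-slow={four_slow:.2f} -> {winner}")
```

The crossover depends entirely on how parallel the workload is, which is exactly why both sides of this thread can point at benchmarks that favor them.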
Exactly, the GPU is on par at qHD; at 720p it won't keep up, and GPUs on Android are lacking more than CPUs.
As for the Krait Pro, we're not talking about future announced devices here, are we? My response was about the One S/XL.
And it's not just AnTuTu; you can see that in the SlashGear link I posted.
hamdir said:
THIS IS A TOTALLY IGNORANT COMMENT.
We have written, tested, taken screenshots of and posted links to a billion tests, benchmarks, articles and scientific facts so far.
NO, THE DUAL-CORE SNAPDRAGON S4 WITH ADRENO 225 DOES NOT AND WILL NOT BLOW AWAY THE ONE X!!! IT CANNOT EVEN MATCH THE ONE X AT 720P.
Do some proper research before you speak!
Check the CPU scores in here and shut up! And again, the One X is pushing nearly 1.8x the pixel count! http://www.slashgear.com/htc-one-x-vs-htc-one-s-benchmarking-war-03221385/
Click to expand...
Click to collapse
The very link you posted has the One X losing in several benchmarks.
You're completely wrong, my friend. Not to mention that all the graphics benchmarks are done offscreen (device native res doesn't matter).
eallan said:
The very link you posted has the One X losing in several benchmarks.
You're completely wrong, my friend. Not to mention that all the graphics benchmarks are done offscreen (device native res doesn't matter).
Click to expand...
Click to collapse
The only test it loses is Linpack; did you even look at the numbers? The CPU test in every other benchmark is taken by the T3 by wide margins, including Quadrant; the S4 only balances things out with I/O scores. The GPU benchmarks done offscreen are stated as such, and in those the Adreno 225 falls behind.
hamdir said:
Exactly, the GPU is on par at qHD; at 720p it won't keep up, and GPUs on Android are lacking more than CPUs.
As for the Krait Pro, we're not talking about future announced devices here, are we? My response was about the One S/XL.
And it's not just AnTuTu; you can see that in the SlashGear link I posted.
Click to expand...
Click to collapse
Ummm, you really should start doing some more research before sounding like you're 100% correct about everything.
http://www.anandtech.com/show/5563/qualcomms-snapdragon-s4-krait-vs-nvidias-tegra-3/2
The MSM8960 beat Tegra 3 in most GPU benchmarks, and AnandTech even said that in the ones where the resolution isn't the same (only 1 or 2 of those benchmarks), Krait will still be faster than Tegra 3.
Dude, you're fighting a losing battle... As I said before, Krait is a generation ahead of Tegra 3.
28 nm vs 40/45 nm fabrication... last year's Cortex cores vs brand-new Krait cores. Tegra 3 is more of a marketing ploy about being "quad-core". It's still mighty fast, but it can't really compare to the newer-generation architecture.
As you said... this is CPU.
Look at the CPU scores in Quadrant: Krait has more than double the CPU score of Tegra.
pewpewbangbang said:
Ummm, you really should start doing some more research before sounding like you're 100% correct about everything.
http://www.anandtech.com/show/5563/qualcomms-snapdragon-s4-krait-vs-nvidias-tegra-3/2
The MSM8960 beat Tegra 3 in most GPU benchmarks, and AnandTech even said that in the ones where the resolution isn't the same (only 1 or 2 of those benchmarks), Krait will still be faster than Tegra 3.
Dude, you're fighting a losing battle... As I said before, Krait is a generation ahead of Tegra 3.
28 nm vs 40/45 nm fabrication... last year's Cortex cores vs brand-new Krait cores. Tegra 3 is more of a marketing ploy about being "quad-core". It's still mighty fast, but it can't really compare to the newer-generation architecture.
As you said... this is CPU.
Look at the CPU scores in Quadrant: Krait has more than double the CPU score of Tegra.
Click to expand...
Click to collapse
HTC One X - HTC's New Hero Device! Mega Information Thread
http://forum.xda-developers.com/showpost.php?p=24189921&postcount=15
http://forum.xda-developers.com/showpost.php?p=24097326&postcount=5
Enough research for you?
And in here, are you even looking? The QUADRANT CPU score is higher on the One X, and your link is showing MDP performance, which is known to be elusive; I'm talking about real-world One S vs One X results!
11269 CPU for the One X vs 8574 CPU for the One S in Quadrant; yes, I/O is making up for it on the S4.
I have no doubt in my mind that a quad-core S4 with an Adreno 3xx will mop the floor with the T3, but it's not here yet, and the dual-core S4 matches but cannot beat the GPU or multi-threading on the T3. Hence both are good for HTC fans; One X vs XL, the T3 will win, but at the cost of less efficiency.
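For what it's worth, the Quadrant CPU sub-scores quoted above work out to roughly a 31% lead, not a blowout in either direction; quick arithmetic on the posted numbers:

```python
one_x_cpu = 11269  # Quadrant CPU sub-score posted for the One X (Tegra 3)
one_s_cpu = 8574   # Quadrant CPU sub-score posted for the One S (S4 Krait)

lead = one_x_cpu / one_s_cpu - 1
print(f"One X leads the One S by {lead:.1%} on the posted Quadrant CPU score")
```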
TBH, I am not interested in either of them if they can't even match (forget about beat) the SGX543MP2 in the iPad 2/iPhone 4S. Even though they are a generation ahead of it, they still fall behind by a big margin. Nvidia is simply selling them on the brand it built with desktop products. Nobody heard about the T1 series, the T2 was a dud, and I don't have too much hope for the T3 either. Let's see how it stacks up against the 2012 SoCs. We will know when the 2012 Exynos, TI and ST-E SoCs are compared against the T3. None of them are out yet, but we sure will see them soon.
All around, Krait is still better CPU-wise: less energy, less heat, etc.
I'd take two newer-generation cores any day over four old ones that aren't even really being used.
And yeah, with all the competition set to release their flagships and SoCs soon, Tegra 3 is going to get run over.
We'll have to wait for Tegra 4, with four Cortex-A15s and the supposed Kepler GPU, for it to become a powerhouse. Too much of a marketing ploy right now, being the first "quad-core".
