[Q] HTC Desire HD or Desire Z? - Desire Q&A, Help & Troubleshooting

So guys, I was just wondering... which one do you prefer between the two? Which one will you get when they're released? For me, I'm leaning towards the HTC Desire HD because of the bigger screen.
Desire HD specs
Android 2.2
4.3″ capacitive touchscreen
8 MP camera with 720p HD video capture
WiFi
HSDPA
GPS
Accelerometer
4 GB internal memory expandable via microSD card slot

I had an HD2 and you can't operate it in one hand, it's just too big. Having said that, it does look awesome
I'm happy with my Desire, though. Something better will always come along and I'm not due an upgrade for about 18 months anyway!!

Yeah, I think that would be an issue for me too... the bigger the screen, the harder to operate it in one hand... but I want a bigger screen for browsing the web really, reading, checking email and stuff. I just wish they'd put a kickstand on that Desire HD, cause I also love watching YouTube on my phone. I'm also not into keyboard phones; that's why I still want the HD rather than the Z.

I need a keyboard. Screen size doesn't matter too much.

I really, really want to buy the HD, since stepping down from my Blackstone's screen real estate to the Desire was a bit of a shock, and I have gorilla-size hands.
Having seen that it has TFT instead of S-LCD, though, I'm not getting one and will wait for other devices, maybe with S-AMOLED.

The HD has my vote, but I won't turn away from the stock Desire. Better screen, and when they drop a system dump we'll have the new Sense too.
Sent from my HTC Desire

The HD has a bigger screen but the same 480x800 resolution as the Desire (tell me if I'm wrong). So what makes it HD? A bigger screen with the same pixel count means fewer pixels per inch, which means lower quality!
I don't like companies that copy others' ideas, but I believe Sony and Samsung (the makers of the SLCD and AMOLED screens) should follow Apple and increase the pixels per inch of their displays.
The Desire (the first one) is perfect in every way, and these new members of the Desire family look great but have almost the same specifications as the Desire that was released half a year ago. I was waiting for a dual-core processor and a much better screen. I'm a bit disappointed with the hardware of these brilliant new phones... and I'd be even more disappointed if I see better hardware in the Windows Mobile phones...
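The pixel-density point above can be checked with a quick calculation. A minimal sketch, assuming the commonly quoted diagonals of 3.7" for the Desire and 4.3" for the Desire HD:

```python
import math

def ppi(width_px, height_px, diagonal_inches):
    """Pixels per inch: diagonal resolution divided by diagonal size."""
    diagonal_px = math.sqrt(width_px ** 2 + height_px ** 2)
    return diagonal_px / diagonal_inches

# Both panels are 480x800; only the diagonal differs.
desire_ppi = ppi(480, 800, 3.7)     # ~252 ppi
desire_hd_ppi = ppi(480, 800, 4.3)  # ~217 ppi

print(round(desire_ppi), round(desire_hd_ppi))  # 252 217
```

Same panel, bigger glass: the HD drops from roughly 252 to 217 pixels per inch.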

Or maybe they're saving the hardware so they can sell Android 3, which is on its way...

fotisp88 said:
I was waiting for a dual-core processor and a much better screen.
Click to expand...
Click to collapse
I do wonder what the preoccupation is with dual cores on a mobile phone! As far as I'm concerned, the Desire is fast enough as it is.
Personally, I'd rather see more time and investment on improving battery technologies.
Regards,
Dave

foxmeister said:
Personally, I'd rather see more time and investment on improving battery technologies.
Regards,
Dave
Click to expand...
Click to collapse
This right here...

foxmeister said:
I do wonder what the preoccupation is with dual cores on a mobile phone! As far as I'm concerned, the Desire is fast enough as it is.
Personally, I'd rather see more time and investment on improving battery technologies.
Regards,
Dave
Click to expand...
Click to collapse
+1
It would be nice to see some improvement in 3D performance, though, but still not at the cost of battery life.

More cores mean faster performance, greater energy efficiency, and more responsive multitasking. And yes, the Desire is fast as is, but I believe the 1GHz clock is not so battery friendly. Besides, any new technology is always welcome, isn't it?

fotisp88 said:
More cores mean faster performance, greater energy efficiency, and more responsive multitasking.
Click to expand...
Click to collapse
It might mean one of those, but it seldom means all three at the same time!
Two cores at 500MHz might well be more energy efficient than a single 1GHz core, but that is not always going to be the case, as it depends on the complexity of the core itself. It might also give you more responsive multitasking, but often at the expense of single-task performance.
Given the limited screen real estate on a phone, there is generally little need to run more than a handful of tasks simultaneously, as usually only one task can interact with the user (i.e. have focus), so what you really need is better and faster task switching, which requires more memory, not more cores.
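That energy-efficiency trade-off can be sketched with the standard dynamic-power scaling, P ≈ C·V²·f. The capacitance and voltage figures below are invented purely for illustration; real SoCs differ:

```python
def dynamic_power(capacitance_f, voltage_v, frequency_hz):
    """Classic CMOS dynamic-power estimate: P = C * V^2 * f."""
    return capacitance_f * voltage_v ** 2 * frequency_hz

C = 1e-9  # effective switched capacitance per core (made-up figure)

# A single core at 1GHz usually needs a higher supply voltage than
# two cores at 500MHz, since voltage scales roughly with frequency.
single_1ghz = dynamic_power(C, 1.2, 1e9)        # ~1.44 W
dual_500mhz = 2 * dynamic_power(C, 1.0, 500e6)  # ~1.00 W

print(single_1ghz, dual_500mhz)
```

Note this covers only the dynamic term: a more complex core design raises the effective capacitance, which is exactly why the dual-core advantage isn't guaranteed, and overclocking both cores back to 1GHz (with the voltage bump that usually requires) erases it.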
Looking beyond the phone to tablets, I can see that changing, but this is a conversation about phones!
My "ideal" phone would be something with the form factor of the Desire, but the processor, memory, and internal storage from the Desire HD, an S-AMOLED screen, and a 2 day battery life with reasonably heavy usage.
HTC are almost there - it's just the batteries that let them down IMHO!
Regards,
Dave

Imagine a dual-core processor at 500MHz per core, meaning it's more energy efficient, BUT with the ability to overclock each core up to 1GHz... *drools*...

cgrec92 said:
Imagine a dual-core processor at 500MHz per core, meaning it's more energy efficient, BUT with the ability to overclock each core up to 1GHz... *drools*...
Click to expand...
Click to collapse
Dual-core is not inherently more energy efficient than a single core; most of the gains come from being able to deliver similar performance at a lower clock speed than a single core running at a higher clock.
So by overclocking to 1GHz you've just got rid of any additional energy efficiency it might have had!
Regards,
Dave

I place the need for a keyboard as a very high requirement, so it's the Desire Z for me.

Have to agree with Dave on the battery issue. On another note, I played around with both devices last night at the HTC London meet, and I must admit that apart from a bit of optimisation of the hardware it's about the same: the HD has the larger screen, the Z has the slide-out keypad... and that's it, in addition to the improved Sense interface etc. I'm happy with my Desire at present, but I must admit I was impressed with the shutdown and boot-up speeds.

I'd rather have my old G1 than the Desire Z if I wanted a hard keyboard.
Sent from my HTC Desire

So guys... first things first, I'll (try to) go with the HD (it might just be too expensive, you know?).
Secondly, the hardware is much better than our current Desire's. The CPU is made on a 45nm process, which means better power efficiency and better performance per clock. Not to mention the extra RAM, and the fact that the GPU is about 4 times more powerful (maybe even more) than our current Desire's. The screen might not be AMOLED, but it's SLCD (not simple TFT, as mentioned somewhere around here), and 800x480 is more than enough for such a screen. I mean, what do you want? If you want HD, buy an HDTV. End of story!
Don't get me wrong, I absolutely love my Desire and I won't swap it in any case; I'll keep it as my secondary phone if I get the HD.

I don't know about the HD naming thing... but if I have to guess, I think it's not about an HD screen but an HD camera or something, cause the new Desire HD has an 8 megapixel cam with 720p recording, rather than the 5 megapixel of the old Desire. I don't know if the 45 nanometer processor bit is true either, cause I haven't read any detailed specification for the Desire HD; I've only read that it has the Qualcomm 8250 Snapdragon CPU, which is the same as the old Desire's.
I know that the Desire Z's CPU is new, cause it has the Qualcomm MSM7230 at 800MHz, also known as the Scorpion CPU, and they say it's supposed to be faster than the 1GHz Snapdragon because the MSM7230 has the new Adreno 205 integrated GPU and is supposed to be better in 3D games. Also, because of the lower clock speed, it's supposed to be more energy efficient. That's why it's also used in the new Google G2 Android device.
Actually, I started this thread because I want to get another Android device. I gave my old Desire to my wife cause she loved it after trying it for a day... and I can't say no to the missis, if you know what I mean hehehe.

Related

HTC HD7 Specs

same exact chipset as the HD2... so does this mean....
what is your take on this?
We just went from no photos of the HTC HD7 to several live photos from several different sources. Some specs of the Windows Phone 7 powered behemoth were unofficially confirmed and a price too (again).
Specs and renders of the HTC HD7
The specs of the HTC HD7 include a 4.3" WVGA touchscreen, a 1GHz QSD8250 Snapdragon CPU and a 5MP camera with dual-LED flash and 720p video recording. There's 8GB of built-in storage, though the situation with the microSD card slot is unclear - there are contradicting reports (HTCInside.de says "yay", WMPowerUser says "nay").
Live shots of the HD7 and its UI, tweaked by HTC
Update: The full specs for the HTC HD7 leaked. It's just 11.2mm thick, has 576MB of RAM, and supports T-Mobile USA's 3G bands. There won't be a microSD card slot. Check out the full specs below.
Anyway, the photos of the HTC HD7's back reveal two interesting things - a kickstand and an "HD3" label (we're guessing that was the original name before HTC changed it to emphasize Windows Phone 7).
The HTC HD7 has a kickstand
A new bit of information suggests that the HTC HD7 will retail on T-Mobile, and a 560 euro (750 USD) figure is mentioned as the price. That's not too far off the 599 euro price for O2 Germany from Tuesday. All this info is from rumor-land, so take it with a pinch of salt.
Source: HTCInside.de (Site in German), WMPoweruser, Mobile01 (Site in Chinese), Update: WMPoweruser
Why would HTC use the same wack (well, now it is) CPU when they are already releasing second-generation Snapdragons? This is especially disappointing when you consider that WP7 is supposed to be a strong gaming platform and the original Snapdragon's GPU sucks.
It has a pretty skimpy battery too, no? Isn't 1200mAh kinda low for a high-powered smartphone?
Award Tour said:
Why would HTC use the same wack (well, now it is) CPU when they are already releasing second-generation Snapdragons? This is especially disappointing when you consider that WP7 is supposed to be a strong gaming platform and the original Snapdragon's GPU sucks.
Click to expand...
Click to collapse
Why did HTC release the Diamond2 with the same internals as the original Diamond? Cost, development times, the fact that it's likely not to make a huge amount of difference?
None of the games we've seen demoed have been short of power, and given the time it takes to develop a new hardware phone platform (a year or so), it's not too surprising they're using slightly older hardware. WP7 has only been in development for about 18 months. It's only been with OEMs for 12 months or so (in a basic form) and probably only 9 months or so for actual hardware development. You don't want to build a new phone with a new OS, with no experience with either, so it makes sense to use hardware you're familiar with. The HTC Desire Z with the new QSD8255 has probably been in development longer than the HD3/7.
QSD8250s are not underpowered, and considering that WP7 doesn't allow third-party developers to multitask, the chip is definitely up to the task.
BS??
Why would T-Mobile release their first Windows Phone 7 device and not put it on their 4G network? I just don't believe this; the whole spec list is sad, sad, sad. I don't believe this would be the spec of the first WP7 phone. It's just too sad, and way too early to say. But it's just a rumor :O
This is a complete joke; I consider this a downgrade, not an upgrade.
I can guarantee that I won't be getting a ****ty HD3. 8GB max storage, my arse. What possible justification can M$ have for not allowing a storage card? I don't get it. Does anybody know the reason? I can't think of a single one for putting this limitation on WP7 phones.
l3v5y said:
Why did HTC release the Diamond2 with the same internals as the original Diamond? Cost, development times, the fact that it's likely not to make a huge amount of difference?
None of the games we've seen demoed have been short of power, and given the time it takes to develop a new hardware phone platform (a year or so), it's not too surprising they're using slightly older hardware. WP7 has only been in development for about 18 months. It's only been with OEMs for 12 months or so (in a basic form) and probably only 9 months or so for actual hardware development. You don't want to build a new phone with a new OS, with no experience with either, so it makes sense to use hardware you're familiar with. The HTC Desire Z with the new QSD8255 has probably been in development longer than the HD3/7.
QSD8250s are not underpowered, and considering that WP7 doesn't allow third-party developers to multitask, the chip is definitely up to the task.
Click to expand...
Click to collapse
The first and last parts are so wrong that I won't even explain why; it's too obvious.
And yes, the QSD8250 isn't underpowered, but it's pure crap graphics-wise.
Gogo Samsung Cetus i917 with Hummingbird and SGX540. What a shame, I liked the design of the HTC. Oh well, I'm not the only one.
l3v5y said:
...the fact that it's likely not to make a huge amount of difference?
Click to expand...
Click to collapse
The difference in GPU performance between the first and second generation Snapdragons (Adreno 200 vs 205) IS huge.
I own an EVO with the older GPU, and the gaming performance is lousy compared to the current generation of high-end mobile GPUs. Of course, I have no doubt that WP7 will handle gaming a bit better than Android, but it doesn't change the fact that it lags waaaay behind what both competitors and HTC itself are now offering. And like I said, a huge draw of WP7 is Xbox Live, so why skimp on the GPU?
Award Tour said:
The difference in GPU performance between the first and second generation Snapdragons (Adreno 200 vs 205) IS huge.
I own an EVO with the older GPU, and the gaming performance is lousy compared to the current generation of high-end mobile GPUs. Of course, I have no doubt that WP7 will handle gaming a bit better than Android, but it doesn't change the fact that it lags waaaay behind what both competitors and HTC itself are now offering. And like I said, a huge draw of WP7 is Xbox Live, so why skimp on the GPU?
Click to expand...
Click to collapse
I heard from a source that the graphics power we see on our HD2 is not its real power, due to the lack of drivers, and that with WP7, which supports DirectX 9, graphics will be improved a lot beyond what we see on our HD2. Here's the proof of what I say: look at these videos running on the HTC Mozart with the same specifications, and look at how smooth the weather animations are: http://www.engadget.com/2010/09/16/i...ndows-phone-7/ (the second video). By the way, the 8250 processor and the 8GB of memory are a limitation set by Microsoft for the first-generation devices; HTC has nothing to do with that, so don't flame them.
hoss_n2 said:
I heard from a source that the graphics power we see on our HD2 is not its real power, due to the lack of drivers, and that with WP7, which supports DirectX 9, graphics will be improved a lot beyond what we see on our HD2. Here's the proof of what I say: look at these videos running on the HTC Mozart with the same specifications, and look at how smooth the weather animations are: http://www.engadget.com/2010/09/16/i...ndows-phone-7/ (the second video). By the way, the 8250 processor and the 8GB of memory are a limitation set by Microsoft for the first-generation devices; HTC has nothing to do with that, so don't flame them.
Click to expand...
Click to collapse
Another worker from HTC has appeared.
"the graphics power that appear on our hd2 is not the really power due to the lack of drivers"
Well, the Android-based devices with the 8250 don't lack drivers, and they still perform like crap compared to the Adreno 205, SGX5x0 and Tegra 2.
" is a limitation by microsoft for the first generation devices htc has nothing to do with that so don't flame them"
It's the minimum requirement.
TheATHEiST said:
This is a complete joke, I consider this a downgrade not upgarde.
I can guarantee that I wont be getting a ****ty HD3, 8GB max storage my arse, what possible justification can M$ have to not allow storage card? I dont get it, does anybody know the reason as I cant think of a single reason why they would put this limitation on WP7 phones.
Click to expand...
Click to collapse
MS said it in a YouTube video: they want you to get apps only from the Marketplace, without being able to "hack" them onto your memory card. Something like that. And something about Zune content restrictions...
Sorry for the long post, can't be bothered replying lots of times!
BeEazy10 said:
Why would T-Mobile release their first Windows Phone 7 device and not put it on their 4G network? I just don't believe this; the whole spec list is sad, sad, sad. I don't believe this would be the spec of the first WP7 phone. It's just too sad, and way too early to say. But it's just a rumor :O
Click to expand...
Click to collapse
Because 4G isn't really available in most places (and LTE is NOT 4G, it's just marketing rubbish). 4G will be big, and is better than 3.5G, but it's just not available enough (especially since the US has only recently decided 3G would be a good idea).
TheATHEiST said:
This is a complete joke; I consider this a downgrade, not an upgrade.
I can guarantee that I won't be getting a ****ty HD3. 8GB max storage, my arse. What possible justification can M$ have for not allowing a storage card? I don't get it. Does anybody know the reason? I can't think of a single one for putting this limitation on WP7 phones.
Click to expand...
Click to collapse
It's not a maximum size, and there will be devices with more than 8GB on board; wait until more than one device has had provisional specs leaked before jumping to conclusions. The lack of user-replaceable storage is there to protect users unfamiliar with the platform, and with smartphones in general. It's not as powerful an OS as WM6.x in terms of out-of-box capability, but the UX is much, much better. If you don't like the hardware/software, stick with WM6.5; it's going to be supported and available for a few years yet, as it's still used in lots of places.
Mr.Sir said:
The first and last parts are so wrong that I won't even explain why; it's too obvious.
And yes, the QSD8250 isn't underpowered, but it's pure crap graphics-wise.
Gogo Samsung Cetus i917 with Hummingbird and SGX540. What a shame, I liked the design of the HTC. Oh well, I'm not the only one.
Click to expand...
Click to collapse
No WP7 device will have any hardware other than the QSD8250 SoC in the first round of devices. For the use in WP7, it's far from underpowered, and in every video I've seen (and from briefly playing with a device) it does everything needed, including very intensive graphics processing. Complain when you've tried it, not before.
Award Tour said:
The difference in GPU performance between the first and second generation Snapdragons (Adreno 200 vs 205) IS huge.
I own an EVO with the older GPU, and the gaming performance is lousy compared to the current generation of high-end mobile GPUs. Of course, I have no doubt that WP7 will handle gaming a bit better than Android, but it doesn't change the fact that it lags waaaay behind what both competitors and HTC itself are now offering. And like I said, a huge draw of WP7 is Xbox Live, so why skimp on the GPU?
Click to expand...
Click to collapse
The only device with a more recent CPU/SoC is the Desire Z, and we've no real proof it's any better. From everything that's been shown off, WP7 test devices on pre-release hardware/software are fast enough for very heavy gaming. WP7 handles graphics very well.
hoss_n2 said:
I heard from a source that the graphics power we see on our HD2 is not its real power, due to the lack of drivers, and that with WP7, which supports DirectX 9, graphics will be improved a lot beyond what we see on our HD2. Here's the proof of what I say: look at these videos running on the HTC Mozart with the same specifications, and look at how smooth the weather animations are: http://www.engadget.com/2010/09/16/i...ndows-phone-7/ (the second video). By the way, the 8250 processor and the 8GB of memory are a limitation set by Microsoft for the first-generation devices; HTC has nothing to do with that, so don't flame them.
Click to expand...
Click to collapse
Firstly, there is no 8GB limitation.
DirectX 9, and excellent XNA support are big pluses of WP7, but the HD2 is probably fully utilising the hardware. WP7 is very good at handling graphics generally, and from what we've seen, it is more than capable on current hardware.
Mr.Sir said:
Another worker from HTC has appeared.
"the graphics power that appear on our hd2 is not the really power due to the lack of drivers"
Well, the Android-based devices with the 8250 don't lack drivers, and they still perform like crap compared to the Adreno 205, SGX5x0 and Tegra 2.
" is a limitation by microsoft for the first generation devices htc has nothing to do with that so don't flame them"
It's the minimum requirement.
Click to expand...
Click to collapse
First, I don't work for HTC; I'm still 19 years old and studying mechanical design.
Second, the graphics drivers on Android are not good (check your info): with some tweaks we got 33 fps in Neocore, while stock only scores 26 fps.
Third, all WP7 devices at release will share common processors, as Windows is developed to work on these processors; other types will come later.
And if you watch the video I posted, you'll see that the HD2's graphics processor is enough. Have you seen the weather animations in the video I posted? Compare them to Android and WM6.5 and you'll understand what I mean, as Microsoft adds DirectX 9 drivers, which improve performance a lot.
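For reference, a quick sketch of what those Neocore figures (quoted above) work out to:

```python
stock_fps = 26    # Neocore on stock Android drivers (figure quoted above)
tweaked_fps = 33  # Neocore after community driver tweaks

gain = (tweaked_fps - stock_fps) / stock_fps
print(f"{gain:.0%}")  # ~27% more frames from software changes alone
```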
So, HD3/7 = "HD2, Windows Phone 7 version".
It's really disappointing to think that HTC deliberately built the HD2 with a few details that make it inconsistent with WP7's requirements.
l3v5y said:
Sorry for the long post, can't be bothered replying lots of times!
DirectX 9, and excellent XNA support are big pluses of WP7, but the HD2 is probably fully utilising the hardware. WP7 is very good at handling graphics generally, and from what we've seen, it is more than capable on current hardware.
Click to expand...
Click to collapse
But I have noticed big improvements in graphical power on WP7 devices with a similar GPU, so I think WM6.5 is not pushing the HD2's graphics to the max.
l3v5y said:
Sorry for the long post, can't be bothered replying lots of times!
Because 4G isn't really available most places (and LTE is NOT 4G, it's just marketing rubbish). 4G will be big, and is better than 3.5G, but it's just not available enough (especially since the US has only recently decided 3G would be a good idea).
It's not a maximum size, and there will be devices with more than 8GB on board; wait until more than one device has had provisional specs leaked before jumping to conclusions. The lack of user-replaceable storage is there to protect users unfamiliar with the platform, and with smartphones in general. It's not as powerful an OS as WM6.x in terms of out-of-box capability, but the UX is much, much better. If you don't like the hardware/software, stick with WM6.5; it's going to be supported and available for a few years yet, as it's still used in lots of places.
No WP7 device will have any hardware other than the QSD8250 SoC in the first round of devices. For the use in WP7, it's far from underpowered, and in every video I've seen (and from briefly playing with a device) it does everything needed, including very intensive graphics processing. Complain when you've tried it, not before.
The only device with a more recent CPU/SoC is the Desire Z, and we've no real proof it's any better. From everything that's been shown off, WP7 test devices on pre-release hardware/software are fast enough for very heavy gaming. WP7 handles graphics very well.
Firstly, there is no 8GB limitation.
DirectX 9, and excellent XNA support are big pluses of WP7, but the HD2 is probably fully utilising the hardware. WP7 is very good at handling graphics generally, and from what we've seen, it is more than capable on current hardware.
Click to expand...
Click to collapse
No proof the Adreno 205 is better? The benchmarks are out, and the Desire HD crushes the old Adreno by a lot.
Sent from my PC36100 using Tapatalk
I will pass; this is the same as my HD2.
hoss_n2 said:
But I have noticed big improvements in graphical power on WP7 devices with a similar GPU, so I think WM6.5 is not pushing the HD2's graphics to the max.
Click to expand...
Click to collapse
It still doesn't change the fact that HTC is using a two-year-old CPU when newer ones are available. The newer ones are better and more efficient as well.
It's okay to think "the older one fits the job, just not as well, but it's cheaper" for a cheap device, but not for their upcoming flagship. This is embarrassing.
To General!
~~Tito~~
HD3 sucks!
I was really looking forward to the HD3, but now, based on the looks/specs, I've really changed my mind (which also saves me about £500).
And this is for many reasons, like:
Looks like **** - especially the back cover
Camera is 5MP; the HD2 has 8MP
8GB internal storage and no microSD card (I currently use a 16GB microSD card in the HD2)
The iPhone 4 has a better spec than this. Actually, all phones have a better spec than this.
TFT screen; the HD2 uses AMOLED.
Not to mention all the restrictions in WP7. Everyone who knows me calls me an M$ fanboy, but I guess I'll shock everyone when I get an iPhone 4.

[SGH-T989] Qualcomm based SGS2 "Celox" aka "Hercules"

So this is interesting. There was a lot of confusion about T-Mobile U.S.'s new phone, the Hercules, and whether or not it was an SGS2 variant. Well, it is and it isn't. This link talks about an SGS2 version launching in Korea and Germany that uses the same Qualcomm SoC as the Sensation. That's an interesting choice, because the Sensation does poorly on benchmarks. Other than being LTE-equipped, it has the same specs and looks the same as the T-Mobile U.S. Hercules. So apparently Samsung is being pretty liberal with what they define as an SGS2.
http://sammyhub.com/2011/08/09/is-this-samsung-galaxy-s-ii-lte-phone-codenamed-celox/
I'm not sure I would be interested in this. It has a larger screen (4.5" is too big), and it will probably get worse battery life with LTE. I think I'll wait for the Galaxy S III.
smartbot said:
I'm not sure I would be interested in this. It has a larger screen (4.5" is too big), and it will probably get worse battery life with LTE. I think I'll wait for the Galaxy S III.
Click to expand...
Click to collapse
The 8060 SoC sort of sucks in the Sensation. That would kill it for me before the 4.5" screen. It's actually using the same chip that's in the HP TouchPad. Interesting choice on Samsung's part. The radio is market specific so LTE won't be everywhere.
It looks like Samsung is trying to cash in on the Galaxy S name and push as many phones as they can.
Probably a short-term business decision, regardless of the consequences to its name.
BarryH_GEG said:
So this is interesting. There was a lot of confusion about T-Mobile U.S.'s new phone, the Hercules, and whether or not it was an SGS2 variant. Well, it is and it isn't. This link talks about an SGS2 version launching in Korea and Germany that uses the same Qualcomm SoC as the Sensation. That's an interesting choice, because the Sensation does poorly on benchmarks. Other than being LTE-equipped, it has the same specs and looks the same as the T-Mobile U.S. Hercules. So apparently Samsung is being pretty liberal with what they define as an SGS2.
http://sammyhub.com/2011/08/09/is-this-samsung-galaxy-s-ii-lte-phone-codenamed-celox/
Click to expand...
Click to collapse
Actually, the Sensation's chip isn't nearly as crappy as it was prior to getting S-OFF. Once developers were able to make kernel mods and other tweaks, the chip performed much better than it did out of the box. I think some of the poor benchmark scores can be attributed to the Sensation's qHD screen. However, I ran CF-Bench last night with both my SGS2 and Sensation clocked at 1.5GHz, and the Sensation beat it each time. The GPU of the Adreno 220 is surprisingly good. I'd be interested to see the Qualcomm chip properly implemented, with the hardware and software tuned in sync.
jlevy73 said:
Actually, the Sensation's chip isn't nearly as crappy as it was prior to getting S-OFF. Once developers were able to make kernel mods and other tweaks, the chip performed much better than it did out of the box. I think some of the poor benchmark scores can be attributed to the Sensation's qHD screen. However, I ran CF-Bench last night with both my SGS2 and Sensation clocked at 1.5GHz, and the Sensation beat it each time. The GPU of the Adreno 220 is surprisingly good. I'd be interested to see the Qualcomm chip properly implemented, with the hardware and software tuned in sync.
Click to expand...
Click to collapse
If the same chip performs significantly faster in Samsung's implementation, the Sensation folks are going to be pissed. I'd also imagine Samsung will do a much better job with the video drivers, so it'll support tons more formats than the Sensation. With all the rumors about Tegra being the alternative due to Exynos shortages, it's interesting they went with Qualcomm.
I'd like my SII to have a 4.5" screen and the back cover of this phone. That's all.
I'm not real smart on this aspect of the technology (chips and performance), but I really doubt that the benchmarks accurately reflect real-world performance.
I have an SGS2. Say I take my EVO 3D, turn on its hotspot, run my SGS2 off the EVO, and run a Speedtest on each: the EVO measures 7-9 Mbps, the SGS2 around 3.
I then load a series of graphics-heavy sites simultaneously, and the SGS2 will finish quicker every time.
I've also run a comparison against a Thunderbolt on the almighty Verizon LTE.
The Speedtests were Thunderbolt 19-21 (lol), SGS2 (hotspotting to WiMAX) around 3, and AGAIN, when it comes to site downloads, the SGS2 is just faster (if more marginally).
That said, the Sensation was a disappointment. Makes sense to me that it wasn't all the chip's fault.
But all that said, my gut tells me those enjoying the SGS2 like I have are in for a letdown in performance with the Hercules.
Hope I'm wrong. Been anticipating it myself.
rockky said:
I'm not real smart on this aspect of the technology (chips and performance), but I really doubt that the benchmarks accurately reflect real-world performance.
I have an SGS2. Say I take my EVO 3D, turn on its hotspot, run my SGS2 off the EVO, and run a Speedtest on each: the EVO measures 7-9 Mbps, the SGS2 around 3.
I then load a series of graphics-heavy sites simultaneously, and the SGS2 will finish quicker every time.
I've also run a comparison against a Thunderbolt on the almighty Verizon LTE.
The Speedtests were Thunderbolt 19-21 (lol), SGS2 (hotspotting to WiMAX) around 3, and AGAIN, when it comes to site downloads, the SGS2 is just faster (if more marginally).
That said, the Sensation was a disappointment. Makes sense to me that it wasn't all the chip's fault.
But all that said, my gut tells me those enjoying the SGS2 like I have are in for a letdown in performance with the Hercules.
Hope I'm wrong. Been anticipating it myself.
You're basically comparing LTE (VZW), WiMAX (Sprint), and HSPA+ (AT&T), which have nothing to do with the phone's processor. Play HD videos on the Sensation/E3D (Qualcomm) and SGS2 (Exynos) and you'll be quite surprised at the difference in real-world performance. jlevy73 brings up an interesting point in that devs seem to be getting better performance out of the Sensation now that it's unlocked than HTC was able to. But devs are still dependent on the drivers provided by the OEMs, so the Qualcomm chip on HTC phones might still end up having a real-world performance deficit no matter how much dev support it gets.
BarryH_GEG said:
You're basically comparing LTE (VZW), WiMAX (Sprint), and HSPA+ (AT&T), which have nothing to do with the phone's processor. Play HD videos on the Sensation/E3D (Qualcomm) and SGS2 (Exynos) and you'll be quite surprised at the difference in real-world performance. jlevy73 brings up an interesting point in that devs seem to be getting better performance out of the Sensation now that it's unlocked than HTC was able to. But devs are still dependent on the drivers provided by the OEMs, so the Qualcomm chip on HTC phones might still end up having a real-world performance deficit no matter how much dev support it gets.
OK... but I still don't totally understand. Aren't processors a factor in how fast data is transmitted?
rockky said:
OK... but I still don't totally understand. Aren't processors a factor in how fast data is transmitted?
The incoming data isn't coming in fast enough to tax the processor. Testing something locally on the phone, like video, Flash-based web pages, and running multiple apps, is a better test of a processor's performance. Software and drivers make a big difference too. The browser on the SGS2 is hardware-optimized where the Sensation/E3D's is not, and it shows in everyday use.
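To illustrate the point with ASSUMED numbers (the page weight and render time below are made up for the sketch, not measurements from this thread): once the radio is reasonably fast, a CPU-bound render time dominates total page-load time, so extra bandwidth barely helps.

```python
# Illustrative sketch with ASSUMED numbers: a 1.5 MB page and a fixed
# 4 s CPU-bound render time. Faster radios shrink only the download term.
PAGE_MB = 1.5    # assumed page weight
RENDER_S = 4.0   # assumed render time

for mbps in (3, 9, 21):  # roughly the Speedtest numbers quoted above
    download_s = PAGE_MB * 8 / mbps  # MB -> megabits, divided by Mbps
    print(f"{mbps:>2} Mbps: {download_s:.1f}s download "
          f"+ {RENDER_S:.0f}s render = {download_s + RENDER_S:.1f}s total")
```

Under these assumptions, a 7x jump in bandwidth (3 to 21 Mbps) cuts total load time by well under half, which is consistent with site loads tracking the processor rather than the Speedtest number.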
BarryH_GEG said:
The incoming data isn't coming in fast enough to tax the processor. Testing something locally on the phone, like video, Flash-based web pages, and running multiple apps, is a better test of a processor's performance. Software and drivers make a big difference too. The browser on the SGS2 is hardware-optimized where the Sensation/E3D's is not, and it shows in everyday use.
Thanks. Good to know that.
BarryH_GEG said:
The incoming data isn't coming in fast enough to tax the processor. Testing something locally on the phone, like video, Flash-based web pages, and running multiple apps, is a better test of a processor's performance. Software and drivers make a big difference too. The browser on the SGS2 is hardware-optimized where the Sensation/E3D's is not, and it shows in everyday use.
Suffice it to say, then, that the US devices will suffer some performance deficit if the Qualcomm chips are employed vs. the Exynos?
rockky said:
Thanks. Good to know that.
lol, 3G/4G is like your internet connection on a PC: it has nothing to do with how powerful the CPU is.
nraudigy2 said:
lol, 3G/4G is like your internet connection on a PC: it has nothing to do with how powerful the CPU is.
That's true, but how fast a page renders, especially one heavy in JavaScript and Flash, does provide insight into the CPU/GPU. I have the Crapbolt, I mean Thunderbolt, and LTE absolutely flies (i.e. 30 Mbps down). With my SGS2 on AT&T's network I get about 5 Mbps down. If I load up androidcentral.com (which is very heavy on graphics, Flash, etc.), the SGS2 renders the page 2-3x faster than my Thunderbolt. You can see the rat-in-a-cage processor of the Thunderbolt choking to render all those graphics.
jlevy73 said:
Actually the Sensation chip isn't nearly as crappy as it was prior to getting S-OFF. Once developers were able to make kernel mods and other tweaks, the chip performed much better than it did out of the box. I think some of the poor benchmark scores can be attributed to the qHD screen of the Sensation. However, I ran CF-Bench last night with both my SGS2 and Sensation clocked at 1.5GHz, and the Sensation beat it each time. The GPU, the Adreno 220, is surprisingly good. I would be interested to see the Qualcomm chip properly implemented, such that the hardware and software were coded in sync.
Unfortunately...
qHD doesn't have much to do with it.
The GPU on the Snapdragon gets overclocked as the CPU is overclocked.
The Exynos is a more robust processor, and Mali is a more powerful GPU.
Maedhros said:
Unfortunately...
qHD doesn't have much to do with it.
The GPU on the Snapdragon gets overclocked as the CPU is overclocked.
The Exynos is a more robust processor, and Mali is a more powerful GPU.
The Sensation's qHD screen has 35% more pixels than the Samsung Galaxy S II's. That has a significant impact on processor workload and benchmarks. At 1.2GHz the Exynos 4210 is much better than the Qualcomm 8060, but at 1.5GHz the Qualcomm will outperform an Exynos at 1.2GHz.
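The 35% figure is straight pixel arithmetic (the Sensation's qHD panel vs the S II's WVGA panel); a quick check:

```python
# qHD (Sensation) vs WVGA (Galaxy S II) pixel counts
qhd = 960 * 540    # 518,400 px
wvga = 800 * 480   # 384,000 px

extra = (qhd - wvga) / wvga
print(f"{extra:.0%} more pixels")  # → 35% more pixels
```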
FishTaco said:
The Sensation's qHD screen has 35% more pixels than the Samsung Galaxy S II's. That has a significant impact on processor workload and benchmarks. At 1.2GHz the Exynos 4210 is much better than the Qualcomm 8060, but at 1.5GHz the Qualcomm will outperform an Exynos at 1.2GHz.
Sadly, it doesn't. Maybe if you removed vcore it would, but I have both devices, and even clocked to 1.7GHz the Sensation cannot match the SGS2 in any benchmark I tried except CF-Bench.
FishTaco said:
The Sensation's qHD screen has 35% more pixels than the Samsung Galaxy S II's. That has a significant impact on processor workload and benchmarks.
But if that is the case, then there isn't much excuse for the fact that almost every Tegra 2 device with a qHD display out there, at 1GHz, has beaten the Qualcomm chip in the Sensation in every review I have watched. The Tegra devices use qHD screens and are clocked lower than the Sensation's, yet outperform it significantly. Furthermore, about the pixels: the Sensation only renders most of Quadrant at 480x800 anyway, because for some reason some applications aren't scaled properly, the Quadrant benchmark for example. Also, because most benchmarks count frames in 2D/3D graphics to assess speed, I often find my Galaxy S2 always hovers around 60fps. That's because it has been capped there by Samsung, so the true benchmark speed of the Galaxy S2 is higher than what is shown on stock firmware.
danielsf said:
But if that is the case, then there isn't much excuse for the fact that almost every Tegra 2 device with a qHD display out there, at 1GHz, has beaten the Qualcomm chip in the Sensation in every review I have watched. The Tegra devices use qHD screens and are clocked lower than the Sensation's, yet outperform it significantly. Furthermore, about the pixels: the Sensation only renders most of Quadrant at 480x800 anyway, because for some reason some applications aren't scaled properly, the Quadrant benchmark for example. Also, because most benchmarks count frames in 2D/3D graphics to assess speed, I often find my Galaxy S2 always hovers around 60fps. That's because it has been capped there by Samsung, so the true benchmark speed of the Galaxy S2 is higher than what is shown on stock firmware.
Yes, qHD makes a difference. Look at PCs as an example.
Have you seen what an SGS2 does against a Tegra 2? You'd be surprised.
The Sensation is not even Cortex-A9 based; it can't compete with the rest of the dual-cores.

Sensation XE or Galaxy Nexus?

What do you guys think is the best smartphone to buy at the moment, the Sensation XE or the Galaxy Nexus?
I am a Desire user and was planning to get the Sensation XE, but now that the Galaxy Nexus is announced I am a bit confused. However, I really like the new Nexus and now I am thinking of buying the Galaxy Nexus. The reason for that is the UI is now getting very similar to HTC Sense, plus new hardware features such as the HD screen, Android Beam, etc. I will make my final decision after reading Nexus reviews.
What do you guys say? Which one would you choose?
I'd rather wait a little longer (a few weeks or so) and see what other devices will get an ICS update.
This would make some other current devices much more appealing.
E.g. those Motorola ones.
HTC might not update their new devices with ICS as it might slow down their phones with HTC Sense on top of it...
Personally I wouldn't want either; holding my Atrix firmly in one hand, I can just about reach the top left and bottom left corners of the touchscreen with my thumb. That would mean on a device like this I'd have to use it two handed... Why are high-end phones getting so giant these days?!
But given the choice between the two, I'd take the Sensation XE. I suspect the Galaxy Nexus will suffer rather badly in 3D performance as a result of using an SGX 540 (albeit one paired with dual-channel memory) for such a high-res screen. We know for sure it's going to get walked all over by the iPhone 4S with its 960x640 screen w/SGX543MP...
Depending on when current-gen Tegra 2/OMAP 4/Exynos/Snapdragon S3 devices get ICS updates, they may prove to be a better choice for that reason alone. This time next year, 1280x720 will probably be more viable. I suspect A5 is the only SoC with the grunt right now. Still, I could be proven wrong.
Edit: This is in the Desire forum? I'm confused...
alpha-dog said:
HTC might not update their new devices with ICS as it might slow down their phones with HTC Sense on top of it...
Sense with HW accel = WIN!
I'm just going by what they've said:
http://www.engadget.com/2011/10/19/htc-were-reviewing-ice-cream-sandwich-and-determining-our-plan/
Actually I don't like Samsung phones... and both the Galaxy Nexus and Sensation XE/XL are just TOO BIG >.>
I'd take an HTC Bliss, or wait some more.
Azurael said:
Personally I wouldn't want either; holding my Atrix firmly in one hand, I can just about reach the top left and bottom left corners of the touchscreen with my thumb. That would mean on a device like this I'd have to use it two handed... Why are high-end phones getting so giant these days?!
But given the choice between the two, I'd take the Sensation XE. I suspect the Galaxy Nexus will suffer rather badly in 3D performance as a result of using an SGX 540 (albeit one paired with dual-channel memory) for such a high-res screen. We know for sure it's going to get walked all over by the iPhone 4S with its 960x640 screen w/SGX543MP...
Depending on when current-gen Tegra 2/OMAP 4/Exynos/Snapdragon S3 devices get ICS updates, they may prove to be a better choice for that reason alone. This time next year, 1280x720 will probably be more viable. I suspect A5 is the only SoC with the grunt right now. Still, I could be proven wrong.
Edit: This is in the Desire forum? I'm confused...
Sorry for being a noob, but what makes the A5 better than the processor in the Galaxy Nexus? I thought the Nexus was a 1.5GHz dual-core; surely that's faster?
I am in exactly the same position; I'm due for an upgrade. It took a long time to decide I wanted the Sensation XE, because it looks so good with the red highlights and the Sense UI is a big favourite after having the Desire for 2 years now.
Now that the Nexus has been announced, I don't know whether to wait and get that. The Nexus's vanilla ICS is really tempting because it looks so good and is such a simplistic OS, and I think once HTC Sense gets laid over the top it won't look anything like that. The thing that puts me off the Nexus is that I think it's too big. 4.65 inches is massive; the Sensation's 4.3 inches is bordering on too big.
It's a hard decision when you may own it on a contract for another 2 years. That Nexus is so future-proofed you could own it for 4 years and not break a sweat about being out of date.
Sorry if this is dumbed down, but I'm not exactly an expert on the specifics, so any advice would be a bonus.
theboymini said:
The Nexus's vanilla ICS is really tempting because it looks so good and is such a simplistic OS, and I think once HTC Sense gets laid over the top it won't look anything like that.
6chrisp said:
Sorry for being a noob, but what makes the A5 better than the processor in the Galaxy Nexus? I thought the Nexus was a 1.5GHz dual-core; surely that's faster?
It depends on how optimised the processor is to work with the phone. The A5 is made specifically for iOS devices, so it will work way better than the OMAP processor used in the Galaxy Nexus, which has to adapt to a lot of different devices that use it. Also, the Galaxy Nexus has a 1.2GHz dual-core OMAP processor, not 1.5GHz.
Sent from my HTC Original Desire using Tapatalk
Isn't the Sensation XL the 16GB-internal, no-microSD-slot version?
If so I'd pass on that instantly.
The HTC Rezound (a.k.a. Vigor) would be the one, when or if it comes out, if it has the specs it's rumoured to have.
theboymini said:
The HTC Rezound (a.k.a. Vigor) would be the one, when or if it comes out, if it has the specs it's rumoured to have.
True. Isn't that like the Sensation XE + more RAM? (1GB rather than 3/4GB)
For me it's either a smaller device (~4 inches) or the Galaxy Nexus. Basically, HTC for me has two big assets: Sense and shells (casings). Sense is much more polished than stock on Gingerbread, but the gap narrows with ICS, especially with some extra apps. While the SGS/Nexus S has a completely ridiculous material on the back, the SGSII and Galaxy Nexus seem to have found some improvements in this matter.
BTW: Thank God (and admins) for the ignore feature in this forum...
Lothaen said:
True. Isn't that like the Sensation XE + more RAM? (1GB rather than 3/4GB)
It should have the 4.3-inch 720p display (like the Nexus) and the 1GB of RAM. If only they put in 16/32GB of memory with expandable microSD, it would be the one. Also, as YorickRise states, a nice solid case rather than those plastic ones Samsung insists on making so their phones are lighter.
Just got to wait....
itachi1706 said:
It depends on how optimised the processor is to work with the phone. The A5 is made specifically for iOS devices, so it will work way better than the OMAP processor used in the Galaxy Nexus, which has to adapt to a lot of different devices that use it. Also, the Galaxy Nexus has a 1.2GHz dual-core OMAP processor, not 1.5GHz.
Sent from my HTC Original Desire using Tapatalk
A Cortex-A9 dual-core is a Cortex-A9 dual-core. In terms of CPU performance there shouldn't be a great deal of difference between any of them, unless the app is using NEON, which isn't supported on the Tegra 2. And the dual-core Qualcomms perform a bit differently because Snapdragon is A8-compatible but not actually based on the A8 or A9 cores; it's Qualcomm's own design. However, the GPU is different in all these chips, and that's where the difference comes in. OMAP 4, as seen in the Galaxy Nexus and newer Moto devices, uses the old SGX540, which you may know from the original Galaxy S inside Samsung's Hummingbird single-core. It was the best mobile GPU in its day and has been clocked much faster and paired with dual-channel memory in the OMAP 4, giving it similar performance to the Mali-400MP in the Exynos and the Adreno 220 in the Qualcomms, and slightly ahead of the GeForce ULP in the Tegra 2. Not enough, IMHO, for such a high-res screen as that in the Galaxy Nexus. The shiny new SGX543MP in the Apple A5 blows all of the competition out of the water at the moment, though! However, we will see other SoCs in early 2012 that are competitive from a GPU standpoint.
Sent from my MB860 using Tapatalk
theboymini said:
It should have the 4.3-inch 720p display (like the Nexus) and the 1GB of RAM. If only they put in 16/32GB of memory with expandable microSD, it would be the one. Also, as YorickRise states, a nice solid case rather than those plastic ones Samsung insists on making so their phones are lighter.
Just got to wait....
Those flimsy plastic cases, while not feeling so great in the hand and looking pretty tacky for a high-end device, are actually a lot more impact-resistant than the metal-bodied phones. Look for Galaxy S II drop tests on YouTube if you don't believe me. It mirrors the experience I've had with laptops (i.e. Apple's pro line with the metal cases have lasted a lot less well than the plastic-bodied non-pro models).
Sent from my MB860 using Tapatalk
itachi1706 said:
It depends on how optimised the processor is to work with the phone. The A5 is made specifically for iOS devices, so it will work way better than the OMAP processor used in the Galaxy Nexus, which has to adapt to a lot of different devices that use it. Also, the Galaxy Nexus has a 1.2GHz dual-core OMAP processor, not 1.5GHz.
Sent from my HTC Original Desire using Tapatalk
Actually, that's a no. A dual-core 1.2GHz processor contains exactly 2 cores capable of exactly 1.2 billion cycles per second, no more and no less, and compared to the Apple A5 in the 4S clocked at 800MHz, that is a 50% increase.
The A5's GPU, however, is indeed a tad better.
Edit: and I have to completely disagree that Apple can in any way utilize the raw processing power better than other companies. That is just plain wrong. What they can and have done is build the system to utilize the GPU for transition effects and simple animations, just as Google does in Honeycomb and onwards.
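The 50% figure is just the clock ratio (which, as the posts around it argue, doesn't settle real-world performance on its own):

```python
# Clock comparison quoted above: 1.2 GHz OMAP 4 vs the A5 at 800 MHz
omap_mhz = 1200
a5_mhz = 800

increase = (omap_mhz - a5_mhz) / a5_mhz
print(f"{increase:.0%} higher clock")  # → 50% higher clock
```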
mortenmhp said:
Actually, that's a no. A dual-core 1.2GHz processor contains exactly 2 cores capable of exactly 1.2 billion cycles per second, no more and no less, and compared to the Apple A5 in the 4S clocked at 800MHz, that is a 50% increase.
The A5's GPU, however, is indeed a tad better.
Edit: and I have to completely disagree that Apple can in any way utilize the raw processing power better than other companies. That is just plain wrong. What they can and have done is build the system to utilize the GPU for transition effects and simple animations, just as Google does in Honeycomb and onwards.
iOS 5 is clearly better optimised than Android 2.3 in most benchmarks, especially in browser-related tests like JavaScript benches. However, we'll see how 4.0 handles it, as 3.0 on tablets is a lot closer to iOS benchmarked on similar hardware. And not all CPUs perform the same at the same clock speed. Do you really think a dual-core Atom (simple in-order core, limited cache and bandwidth, fewer execution units) performs the same as an i3 (complex out-of-order core with loads of cache, loads of bandwidth, and lots of execution units) at the same clock, for example? In fact, the reason I chose this particular comparison is that it's an extreme case of CPUs sharing the x86 instruction set; the i3 would be more than twice as fast in most cases.
CPU performance depends on a lot of things: for example, the number of execution units capable of a given operation, pipeline length, cache optimisations, memory bandwidth, bus speeds, the efficiency of the instruction rescheduler (for out-of-order CPUs), and a number of other factors. Even CPUs with the same cores (like ARM's A9, for example) can perform differently; some (like TI's OMAP) have dual-channel memory whereas Tegra 2, for example, is constrained to a single channel, although this is much more likely to affect GPUs (which are also integrated and share memory bandwidth with the CPU) than CPUs with current cores. The CPU cores in Snapdragon S3, particularly, perform quite differently (a little worse in most cases) than other current-gen dual-core ARM chips due to their use of Qualcomm's Scorpion CPU core (an ARMv7 design compatible with, but not identical to, an A8; a single Scorpion is faster than a single A8 due to partial out-of-order support, but the more complete out-of-order support and shorter instruction pipeline mean the A9 will perform better per core at the same clock than Snapdragons).
And that's before we even mention instruction set extensions like NEON and SSE; when running code which is optimised for and can take advantage of these (which tends to be media-related apps like video encoding), you could end up with orders-of-magnitude differences in performance. The implementation of Sandy Bridge's AVX extensions allows it to double performance at the same clock in Linpack benchmarks versus the previous-generation Nehalem-based chips, for example.
Oh, and the SGX543MP2 isn't just a tad faster than anything we have in Android hardware at the moment, it's A LOT faster, especially given that the A5 uses dual-channel memory and everything we have bar the TI OMAP 4, with its aging (though fast-clocked) SGX540, is single-channel. Also bear in mind that the iPhone 4/4S GPU is dealing with a 960x640 = 614,400-pixel display, whereas many high-end Android devices (the GSII for example) are still only packing 800x480 displays with 384,000 pixels, and the GPU has to do a lot more work to render 60% more pixels! Be careful when comparing benchmarks!

Discussing the performance of the Tegra 3 SoC

The iPad 3's GPU shows about twice the performance of the One X's, but in OFFSCREEN 720p mode:
Check it out: http://glbenchmark.com/result.jsp?b...ersion=all&certified_only=1&brand=all&gpu=all
Would that mean there are no worries for game performance, since we have a much lower resolution?
What I was thinking is that game makers who optimize their games for the new iPad wouldn't make them run at native resolution, but at something around 720p, and that would mean our One X is simply incapable of running iPad-optimized games, with less than half the performance of the iPad.
What do you think?
eeeeeee said:
Check it out, http://glbenchmark.com/result.jsp
Would that mean there are no worries for game performance, since we have a much lower resolution?
What I was thinking is that game makers who optimize their games for the new iPad wouldn't make them run at native resolution, but at something around 720p, and that would mean our One X is simply incapable of running iPad-optimized games, with less than half the performance of the iPad.
What do you think?
So why not post this in the mega thread, mate?
hamdir said:
So why not post this in the mega thread, mate?
The mega thread is already pointless, as there is a forum for the One X, and some subjects there are being neglected or not getting as much attention as they deserve.
http://glbenchmark.com/phonedetails.jsp?D=Apple+iPad+3&benchmark=glpro21
720p offscreen
Egypt iPad3 140.9
Egypt iPad2 88.8
Egypt Prime 68
Egypt One X 64
Egypt One S 50
Pro iPad3 252.1
Pro iPad2 148.8
Pro Prime 81
Pro One X 82
Pro One S 76
Standard (native resolution)
Egypt iPad3 59.9 @2048x1536
Egypt iPad2 59.6 @1024×768
Egypt Prime 46.8 @1280x800
Egypt One X 51 @1280x720
Egypt One S 57 @540×960
Pro iPad3 60 @2048x1536
Pro iPad2 60 @1024×768
Pro Prime 54 @1280x800
Pro One X 54 @1280x720
Pro One S 60 @540×960
So we are talking between 2x and 3x the T3, not 4x like Apple claimed.
The iPad 3's CPU is the same as the iPad 2's:
http://www.engadget.com/2012/03/13/new-ipad-gets-benchmarked-1gb-ram-confirmed-no-boost-in-cpu-sp/
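Dividing the 720p offscreen numbers above gives that range directly:

```python
# GLBenchmark 720p offscreen scores quoted above: iPad 3 vs One X (Tegra 3)
egypt = 140.9 / 64   # Egypt test
pro = 252.1 / 82     # Pro test

print(f"Egypt: {egypt:.1f}x, Pro: {pro:.1f}x")  # → Egypt: 2.2x, Pro: 3.1x
```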
hamdir said:
http://glbenchmark.com/phonedetails.jsp?D=Apple+iPad+3&benchmark=glpro21
720p offscreen
Egypt iPad3 140.9
Egypt iPad2 88.8
Egypt Prime 68
Egypt One X 64
Egypt One S 50
Pro iPad3 252.1
Pro iPad2 148.8
Pro Prime 81
Pro One X 82
Pro One S 76
So we are talking between 2x and 3x the T3, not 4x like Apple claimed.
Thanks. Indeed a little more than 2x on some benchmarks, though barely 3x at most.
Now the big question here is whether the iPad will be upscaling games or rendering at native resolution.
Another question is how upscaling (1:4) looks versus native low resolution (1:1).
These details will really affect my decision whether to get the One X, since I'm not planning to buy an outclassed GPU, as upscaled games will be able to run 2x on the iPad 3, and the One X is already incapable of keeping up before being released.
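The 1:4 ratio follows from the resolutions discussed in the thread; checking the pixel counts:

```python
# New iPad vs iPad 2, and vs a 720p One X screen
ipad3 = 2048 * 1536   # 3,145,728 px
ipad2 = 1024 * 768    # 786,432 px
one_x = 1280 * 720    # 921,600 px

print(ipad3 / ipad2)            # → 4.0 (exactly 1:4, i.e. 2x per axis)
print(round(ipad3 / one_x, 1))  # → 3.4
```

So an iPad 2 title upscaled 1:4 maps each rendered pixel to a clean 2x2 block, while the new iPad still pushes about 3.4x the pixels of a 720p panel at native resolution.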
eeeeeee said:
Thanks. Indeed a little more than 2x on some benchmarks, though barely 3x at most.
Now the big question here is whether the iPad will be upscaling games or rendering at native resolution.
Another question is how upscaling (1:4) looks versus native low resolution (1:1).
These details will really affect my decision whether to get the One X, since I'm not planning to buy an outclassed GPU, as upscaled games will be able to run 2x on the iPad 3, and the One X is already incapable of keeping up before being released.
Most game developers on Twitter said the iPad 3's GPU is not enough to feed the massive pixel count, so yes, anticipate a lot of upscaling; AnandTech is making the same assumption.
You keep forgetting that most TegraZone T3-enhanced games used the quad cores for graphics... so the T3's standalone GPU will not keep up, but the quads will make up for it.
It's really not an apples-to-apples comparison; the T3 is designed as a complete mobile graphics solution. The Shadowgun THD and Glowball demos are real-world examples.
hamdir said:
Most game developers on Twitter said the GPU will not be enough to feed the massive pixel count, so yes, anticipate a lot of upscaling; AnandTech is making the same assumption.
You keep forgetting that most TegraZone T3-enhanced games used the quad cores for graphics... so the T3's standalone GPU will not keep up, but the quads will make up for it.
It's really not an apples-to-apples comparison; the T3 is designed as a complete mobile graphics solution. The Shadowgun THD and Glowball demos are real-world examples.
But to compete with the new iPad, one must optimize the game for the T3 at the code level, which is never going to happen outside of Nvidia's TegraZone.
eeeeeee said:
But to compete with the new iPad, one must optimize the game for the T3 at the code level, which is never going to happen outside of Nvidia's TegraZone.
Why? Most games are optimised for their platforms. I didn't even think the DHD with its Adreno 205 could run Shadowgun, and it did when optimised for it.
Unity's developer documents reveal they optimize for every main GPU in the market.
And regarding TegraZone: it's one of the best reasons to buy a T3. You have a powerhouse like Nvidia pushing devs to optimize for it.
It's the same case with the iPad: devs have to optimize for the tile-based GPU. Believe me, keeping all those pixels inside a tile-based buffer will be a major headache for iPad 3 games.
I really doubt the iPhone 5 will carry the 543MP4. If they want to keep parity with the iPad 3, they will simply bump the iPhone 4S CPU from 800MHz to 1000MHz and leave the old GPU.
---------- Post added at 12:41 PM ---------- Previous post was at 12:39 PM ----------
If you run GPU benchmarks between the Xbox 360 and PS3, the Xbox will mop the floor with the PS3's GPU.
But in the real world there are many PS3 games the Xbox 360 can't even match, while 90% of the time they're equal.
The PS3 has the cores to assist its inferior GPU, and the Xbox 360 has the GPU assisting its inferior CPU; it's a very similar scenario here, mate.
In desktop PC development they are already moving to unify CPU/GPU cores.
Finally:
we already don't have a games match between Android and Apple, so what's your best choice for GPU if you skip the T3? None, my friend, there is none; the T3 is the best SoC for graphics right now on Android.
hamdir said:
Why? Most games are optimised for their platforms. I didn't even think the DHD with its Adreno 205 could run Shadowgun, and it did when optimised for it.
Unity's developer documents reveal they optimize for every main GPU in the market.
And regarding TegraZone: it's one of the best reasons to buy a T3. You have a powerhouse like Nvidia pushing devs to optimize for it.
It's the same case with the iPad: devs have to optimize for the tile-based GPU. Believe me, keeping all those pixels inside a tile-based buffer will be a major headache for iPad 3 games.
I really doubt the iPhone 5 will carry the 543MP4. If they want to keep parity with the iPad 3, they will simply bump the iPhone 4S CPU from 800MHz to 1000MHz and leave the old GPU.
Offloading graphics to CPU cores is very unnatural and requires heavy modifications to your code.
Again, I'm not a graphics expert, but it really depends on one's coding style, because if you code your game well enough, it will not be hard to port it over to the Tegra 3. However, if there's no sign of threads or any separation of processes in your game, you will literally have to re-develop the game for the Tegra 3.
Now, we all know the fact that developers favor iOS over Android in almost any case, especially for gaming, so in my opinion we can expect really bad performance within less than a year of holding the One X.
By the way, the fact that the Tegra 3 is the best SoC out there for Android kind of depresses me.
I wanted to see Android smartphone manufacturers such as HTC and Samsung adopt the PowerVR solution, since it's sadly much, much better than any other mobile GPU on the market.
Yes, but I am a graphics expert.
We don't have parity in iOS games vs Android; thanks to TegraZone we have a lot more games.
In fact, the Tegra 3 already runs most games a lot better than the iPad 2 does, at a much higher resolution.
I don't get your point.
The competition is T3 vs iPad 2, not iPad 3, since the extra cores will simply serve to feed more pixels (even if slightly upscaled, the massive pixel count is beyond 4x); the T3 has 1.5x to 2x competition at most.
If you have time, read the Unity development document to get a better idea.
You still didn't answer me: what is your alternative? Buying an iOS device?
hamdir said:
Yes, but I am a graphics expert.
We don't have parity in iOS games vs Android.
In fact, the Tegra 3 already runs most games a lot better than the iPad 2 does, at a much higher resolution.
I don't get your point.
The competition is T3 vs iPad 2, not iPad 3, since the extra cores will simply serve to feed more pixels; the T3 has 1.5x to 2x competition at most.
If you have time, read the Unity development document to understand.
You still didn't answer me: what is your alternative? Buying an iOS device?
Click to expand...
Click to collapse
Just to make sure we understand each other: I'm not even thinking of buying an iOS device, the OS sucks, but its devices do have very good hardware, and developers optimize things for them first.
You are right that more CPU power CAN be better and therefore the Tegra 3 might perform better. However, I'm honestly asking: will we get developer support for every app and game that needs performance, or will our SoC get neglected,
filled with choppy, stuttering games that use only two cores at half the power and push everything to the GPU?
Just look at Adobe Photoshop Touch: Tegra 3 manages something like 5 fps when zooming and panning while the iPad 2 runs at 60 fps.
PowerVR's tiling has its performance downsides too.
An upcoming Android phone with an Intel SoC + PowerVR SGX544 is on the way, so maybe you should consider that one.
As for me, I'm rushing to Nvidia and never coming back; it's thanks to them that we are seeing a lot more games on Android.
hamdir said:
PowerVR's tiling has its performance downsides too.
An upcoming Android phone with an Intel SoC + PowerVR SGX544 is on the way, so maybe you should consider that one.
As for me, I'm rushing to Nvidia and never coming back.
Click to expand...
Click to collapse
I don't think I will go for x86, but have a look at my edited previous post about Adobe Photoshop as an app that isn't optimized for Tegra 3, and nobody really cares...
Regarding Photoshop Touch, I hope it's a development problem and not the limited memory bandwidth on T3; we should ask Nvidia about this.
hamdir said:
Regarding Photoshop Touch, I hope it's a development problem and not the limited memory bandwidth on T3; we should ask Nvidia about this.
Click to expand...
Click to collapse
Either one proves my point: either Tegra 3 is useless without optimizations, or Tegra 3 is simply not good enough. Shame on Nvidia, what can I say.
eeeeeee said:
Either one proves my point: either Tegra 3 is useless without optimizations, or Tegra 3 is simply not good enough. Shame on Nvidia, what can I say.
Click to expand...
Click to collapse
Mate, optimizing for certain hardware is not wrong! In fact, all hardware requires this; how else would you push things forward? Otherwise let's stick PowerVR and ARM in every device and call it quits so devs don't have to "optimize".
hamdir said:
Mate, optimizing for certain hardware is not wrong! In fact, all hardware requires this; how else would you push things forward? Otherwise let's stick PowerVR and ARM in every device and call it quits so devs don't have to "optimize".
Click to expand...
Click to collapse
I accidentally pressed the thanks button =] although, thinking it over, you do deserve one after your useful posts.
All hardware requires optimizations, but history has proven that when software has to be heavily optimized for a device to get good performance, developers may neglect it and just accept worse performance. We are witnessing exactly that case when comparing Adobe Photoshop Touch across platforms, and I could give many other examples.
OK, regarding upscaling:
For sure iPad 3 devs will use upscaling, but not by 4x; anything higher than 2x upscaling will show its ugly face.
In fact, the only worry I have about T3 in general is not the GPU but the limited memory bandwidth, yet most tablets are 720p and the T3 seems to keep up well; it also handles 1080p H.264 very well.
Heavy games like Infinity Blade might suffer from the bandwidth, though.
But consider this: I always wanted to move to Tegra. Why? Because of much better game support; you can't deny it's Tegrazone that started pushing high-end 3D games to Android.
Regarding optimizations, everyone said the same about the PS3, but with Sony backing it they were proven wrong. I find it really wrong that we're expected to pamper developers just because they are now comfortable with PowerVR thanks to iOS.
hamdir said:
OK, regarding upscaling:
For sure iPad 3 devs will use upscaling, but not by 4x; anything higher than 2x upscaling will show its ugly face.
In fact, the only worry I have about T3 in general is not the GPU but the limited memory bandwidth, yet most tablets are 720p and the T3 seems to keep up well.
Heavy games like Infinity Blade might suffer from the bandwidth, though.
Click to expand...
Click to collapse
Good subject. Do you have any details about games on the Transformer Prime, or even apps that already suffer from it?
Regarding the PS3 analogy, I honestly think it's irrelevant, simply because the PS3 is a permanently targeted platform, which I hope Tegra 3 remains for a long period of time.
Was Nvidia wrong not to shoot for the moon with its GPU and memory bandwidth? Yes, but that's why they were the first to ship a quad-core mobile SoC while the Apple A6, ARM A15 and quad S4 are still far away.
Oh, and a very important point:
According to Unity, Nvidia Tegra has the best development tools and performance analyzers; Qualcomm comes second, while the iPad 2 tools are lacking.
All we know about the Prime is that all THD games run a lot better vs the iPad 2.
There is a reason why so many say "**** benchmarks".

One X vs. One S. Performance and dev

Getting a new phone, as I ran over my Razr with my Land Cruiser 40...
I live in Norway, so I would be getting the EU version of the X with Tegra 3.
But looking at the benchmarks of the US version (dual-core) of the X, it is clearly very fast. I'm wondering if we would get similar performance out of the S? And would it be as "xda friendly" as I suspect the X will become?
Money is not the issue; I'm just not sure if I would be comfortable with such a large phone... (well, the Razr had bezels from hell, so it was very wide).
buljo said:
Getting a new phone, as I ran over my Razr with my Land Cruiser 40...
I live in Norway, so I would be getting the EU version of the X with Tegra 3.
But looking at the benchmarks of the US version (dual-core) of the X, it is clearly very fast. I'm wondering if we would get similar performance out of the S? And would it be as "xda friendly" as I suspect the X will become?
Money is not the issue; I'm just not sure if I would be comfortable with such a large phone... (well, the Razr had bezels from hell, so it was very wide).
Click to expand...
Click to collapse
The One S's Krait is fast for a single app, and even for two, but Tegra 3 clearly outperforms Krait in multi-app and gaming performance. So I would say the Tegra 3 is more future-proof than the dual-core Krait.
Sent from my GT-I9000 using xda premium
HTC One S
I'm a lucky guy who already has an HTC One S. I ran the same benchmarks as the ones published for the HTC One XL and I got the same results.
Rastasia said:
I'm a lucky guy who already has an HTC One S. I ran the same benchmarks as the ones published for the HTC One XL and I got the same results.
Click to expand...
Click to collapse
Proof please of phone ownership.
It doesn't make any sense for HTC to release their flagship with a less powerful processor than their "mid-range" phone.
It wouldn't be the first time a mobile phone company forgot about what makes sense, but it's not like the HTC One X is going to be underpowered, regardless.
I really don't like the look of PenTile screens, which was the main deciding factor in favor of the One X for me.
From the comments on that benchmark blog post it seems the tests are unrealistic; the scores for the alternatives are artificially low (IIRC).
The One S at its native qHD res vs the One X at its native HD res: they trade blows and are almost equal; the One X will only show its muscles in quad-optimized apps.
As for One X vs One XL: the One X is better, since the T3 beats the dual-core S4 at 720p.
qpop said:
Proof please of phone ownership.
It doesn't make any sense for HTC to release their flagship with a less powerful processor than their "mid-range" phone.
It wouldn't be the first time a mobile phone company forgot about what makes sense, but it's not like the HTC One X is going to be underpowered, regardless.
I really don't like the look of PenTile screens, which was the main deciding factor in favor of the One X for me.
From the comments on that benchmark blog post it seems the tests are unrealistic; the scores for the alternatives are artificially low (IIRC).
Click to expand...
Click to collapse
It doesn't make sense, but they have crippled the One S with the PenTile screen and low storage.
The S4 chip in the One S is a generation ahead of anything else until the ARM A15 chips arrive. Qualcomm's Krait is supposed to be much closer to A15 spec than the A9 cores in Tegra 3.
proof
http://www.flickr.com/photos/[email protected]/7026471353/
http://www.flickr.com/photos/[email protected]/6880371452/
http://www.flickr.com/photos/[email protected]/7026471425/
Rastasia said:
http://www.flickr.com/photos/[email protected]/7026471353/
http://www.flickr.com/photos/[email protected]/6880371452/
http://www.flickr.com/photos/[email protected]/7026471425/
Click to expand...
Click to collapse
Congrats on your sexy beast mate, I love the One S.
But these benchmarks are nothing new; Vellamo is a Qualcomm test that isn't heavily multi-threaded.
Excellent device, what color did you get?
The blue/grey one. Do you want me to test another benchmark?
Sent from my HTC One S using XDA
Rastasia said:
The blue/grey one. Do you want me to test another benchmark?
Sent from my HTC One S using XDA
Click to expand...
Click to collapse
Best color mate! I bet it's gorgeous.
yes try Antutu https://play.google.com/store/apps/details?id=com.antutu.ABenchMark&feature=search_result
and GL benchmark offscreen 720p tests https://play.google.com/store/apps/details?id=com.glbenchmark.glbenchmark21&feature=search_result#?t=W251bGwsMSwxLDEsImNvbS5nbGJlbmNobWFyay5nbGJlbmNobWFyazIxIl0.
Congrats on getting the new device.
May I ask why you got it so fast...?!?!
fi3ry_icy said:
Congrats on getting the new device.
May I ask why you got it so fast...?!?!
Click to expand...
Click to collapse
It's a test device from a provider.
---------- Post added at 02:47 PM ---------- Previous post was at 02:37 PM ----------
hamdir said:
best color mate! i bet its gorgeous
yes try Antutu https://play.google.com/store/apps/details?id=com.antutu.ABenchMark&feature=search_result
and GL benchmark offscreen 720p tests https://play.google.com/store/apps/details?id=com.glbenchmark.glbenchmark21&feature=search_result#?t=W251bGwsMSwxLDEsImNvbS5nbGJlbmNobWFyay5nbGJlbmNobWFyazIxIl0.
Click to expand...
Click to collapse
Can't get your second benchmark, but here's your first request:
http://www.flickr.com/photos/[email protected]/7026577361/
Rastasia said:
It's a test device from a provider.
---------- Post added at 02:47 PM ---------- Previous post was at 02:37 PM ----------
Can't get your second benchmark, but here's your first request:
http://www.flickr.com/photos/[email protected]/7026577361/
Click to expand...
Click to collapse
Thanks, you just confirmed my point of view: AnTuTu shows off the quads a lot better than a Qualcomm test does.
The HTC One X is way better with Tegra.
hamdir said:
Thanks, you just confirmed my point of view: AnTuTu shows off the quads a lot better than a Qualcomm test does.
Click to expand...
Click to collapse
This is proof
source
http://www.youtube.com/watch?v=TmWRaaAteZg
For those who haven't checked AnandTech's review of the iPad 3, I suggest you do; it's full of juicy information.
Having settled all this info in my mind, it's quite easy to draw a clear picture.
The Tegra 3 is a chipset that jumped ahead of most competitors with such an early entry into quad-core mobile CPUs; the only other quad on the market is the PS Vita's SoC (Sony CXD5315, built by Toshiba), both quad A9. Early worries about memory bandwidth/L2 cache are unfounded, simply because the ARM A9's memory controller can't keep up with more anyway.
The Snapdragon S4 introduced dual memory channels and major per-core optimization and performance; however, its advantage is offset by the lack of cores. It's a good design for a quad, but right now its excellent cores will still be stalled by multitasking and its amazing memory bandwidth will go to waste.
Right now Tegra 3 is still the best mobile CPU you can have, better than the A5X and the dual-core S4; however, the major letdown of the Tegra 3 is its GPU.
Nvidia claimed a 12-core GPU on the T3, but that's simply the number of SIMDs, not physical cores; ARM also calls its SIMDs cores, but this only confuses customers. Nvidia will use this naming scheme to counter Apple's claims, as Asus has already done on Twitter.
The PowerVR SGX543MP series has physical core scaling, but only 8 SIMDs per core, vs 12 on the Tegra 3, 8 on the Adreno 225 and 5 on the Mali-400; the A5 has 2 cores and hence 16 SIMDs, while the A5X has 4 cores and hence 32 SIMDs.
In reality the Tegra 3 GPU falls a little short of the iPad 2 GPU, while it beats the Mali-400 and Adreno 225 in most situations but not all areas. Nvidia extracted all they could from this GPU with some aggressive driver optimization and hacks; this is how they achieved their 3x-over-Tegra-2 claim. In other words, it's already optimized; don't expect much headroom here.
Nvidia's GPU is really disappointing but not a disaster; it just doesn't have a lot of headroom. Right now it's still the fastest GPU for Android and has the quad to back it up once an app is T3-optimized; the quad cores can add console-quality gameplay features like ragdoll, physics and particles, but might not improve FPS (this would require an engine written from the ground up for multi-core, and I doubt devs will be so inclined).
The iPad 3 GPU is massively powerful, a testament to the PS Vita's GPU. However, unlike on the Vita, its power is wasted on all those pixels, so games will benefit from it but not get the 2x jump over current iPad 2/iPhone 4S games. As Infinity Blade 2 shows, it only managed a 1.4x resolution increase without losing frame rate, so most 3D games on the iPad 3 will not be retina-boosted. Why do I keep bringing up the iPad? Because iOS is still the leader when it comes to mobile gaming and most games we get on Android are ports; the future of iOS games will shape the future of Android games.
All this makes me conclude the following:
Android's main appeal is still the OS, what you can do with it, and multitasking, which translates into the main appeal of Tegra 3; it's ridiculous to even think quad cores cannot benefit such a heavily multi-threaded OS.
Android is still not the best platform for gaming, but whether we like it or not, its best ground for gaming is Tegrazone, simply because we have Nvidia pushing/bribing developers in this direction.
If you are buying an Android device right now, the best you can do is the Nvidia Tegra 3; but damn you, Nvidia, for not being more generous.
It's been the case for ages: asymmetry between CPU and GPU power. The Xbox 360 had a more powerful GPU relative to its CPU, the PS3 had the CPU over its GPU, and the Apple A5X has its GPU against last year's CPU. The only SoC that satisfies both sides is the PS Vita's, with its quad CPU and quad GPU, but that's because Sony has to worry about a product life cycle of over 4 years.
So you can see the Tegra 3 has the CPU over its GPU; it's not really breaking the norms here, it's business as usual.
Went for the S.
Well, I just posted in the S forums:
http://forum.xda-developers.com/showthread.php?p=24454178#post24454178
I ended up with the One S instead of the X; the feel of the phones did it for me.
