[Q] Is Exynos worth buying? - Galaxy Note 10.1 (2014 Edition) Q&A, Help & Troubleshooting

The Snapdragon version isn't available in my country, so I will have to buy the Exynos (pretty cheap right now, the equivalent of $500). The thing is, reviews say the Snapdragon doesn't lag a bit, while the Exynos does on such a large device.
Is the performance really that bad? I'm not into ROMs right now, by the way.

No, it's not worth buying over the Snapdragon version. My S4 is faster than my Note...
Sent from my GT-I9505 using xda app-developers app

With HMP enabled there is no comparison between the two: the Exynos is up to 50% faster and potentially more efficient. With HMP disabled (as things currently are), the Qualcomm is the slightly better chip, but I'm not convinced that the difference is enough to prefer one SoC over the other...
In short, the Exynos 5420 is artificially neutered to seem worse than the Qualcomm, yet, even so, going either way won't make much of a difference...

Do you have any benchmarks to prove your claim of a 50% speed bump?
to OP
There are a lot of threads about Exynos vs. Snapdragon; long story short:
Exynos: tad better CPU
Snapdragon: tad better GPU
I've had both and ended up with the Exynos, because I didn't need 4G but needed 32 GB (in Scandinavia the 4G model seems to be 16 GB only).
Lag was more or less the same.
I felt the battery life on the Exynos was a tad better.
They felt equally snappy when they needed to be.
BUT!!!
App support was a tad better on the Snapdragon, i.e. more apps in the Play Store worked with the Snapdragon version, a few more games, etc. No big deal for me, but it still got me ticked off when I noticed a few apps I bought weren't compatible (yet?!) with the new Exynos chip (but worked with my Sammy S3, also an Exynos chip, just older).

Exynos is fine. I've played with both, and from a UI and app-use perspective you can't tell the difference. The Adreno's a bit faster than the Mali, but not so much as to drastically alter performance. Some games are better optimized for Adreno, so depending on your choice of games it could make a difference. As for app compatibility, it's more likely the 2,560x1,600 display that's causing the issue, not the specific SoC. If there were huge differences between the Exynos and S-800, or drastic app performance differences and compatibility issues, it would be all over the N3 forum, and it's not.

DeBoX said:
Do you have any benchmarks to prove your claim of a 50% speed bump?
HMP for 8 cores has not yet been released, but look at the Note 3 Neo: it uses two fewer large cores and posts the same AnTuTu score as our Note, so by adding two more large cores you can expect the score to be about 50% higher. As I said, that is only true where all 8 cores are used at the same time and are not throttled (that is why I said "up to").

Stevethegreat said:
Look at the Note 3 Neo: it uses two fewer large cores and posts the same AnTuTu score as our Note
Not really. It also has a 267 PPI display, which is benefiting its graphics scores in AnTuTu compared to the SGS4 at 441 PPI and the N3 at 386 PPI.
http://www.nairaland.com/1597298/samsung-budget-galaxy-note-neo
S-800 vs. Exynos on the N3...

BarryH_GEG said:
Not really. It also has a 267 PPI display, which is benefiting its graphics scores in AnTuTu compared to the SGS4
I was referring specifically to CPU scores, which are the only ones that benefit from HMP.
I ran a quick AnTuTu (CPU) test on my Exynos 5420-equipped Note, and here are the results: http://i.imgur.com/zD32DZQ.png
Notice how remarkably similar they are to the Note Neo's CPU score:
http://www.gsmarena.com/showpic.php3?sImg=newsimg/14/01/sgn3n-leak/gsmarena_006.jpg&idNews=7538
Note that the Note Neo has only two large cores, clocked 10% lower than the Exynos 5420's, and it still posts almost the same score merely by employing the help of the small cores. Now add two more large cores and you'd get 50% more performance; it's simple math, really...
Now, I'm not saying that's performance we would actually see on most occasions; it would either be throttled or not even supported by most apps, but it's potentially there (which was my point in saying "up to").
What will *definitely* be there if HMP is enabled, though, is better battery life, as it would make more efficient use of the small cores. Since the Exynos 5422 is also on 28nm yet has HMP enabled, I believe we lack HMP for strategic reasons (so that Samsung will sell more Exynos 5422 / Qualcomm-equipped machines).

Stevethegreat said:
I was referring specifically to CPU scores, which are the only ones that benefit from HMP.
I ran a quick AnTuTu (CPU) test on my Exynos 5420-equipped Note, and here are the results: http://i.imgur.com/zD32DZQ.png
Notice how remarkably similar they are to the Note Neo's CPU score:
http://www.gsmarena.com/showpic.php3?sImg=newsimg/14/01/sgn3n-leak/gsmarena_006.jpg&idNews=7538
Note that the Note Neo has only two large cores, clocked 10% lower than the Exynos 5420's, and it still posts almost the same score merely by employing the help of the small cores. Now add two more large cores and you'd get 50% more performance; it's simple math, really...
Now, I'm not saying that's performance we would actually see on most occasions; it would either be throttled or not even supported by most apps, but it's potentially there (which was my point in saying "up to").
What will *definitely* be there if HMP is enabled, though, is better battery life, as it would make more efficient use of the small cores. Since the Exynos 5422 is also on 28nm yet has HMP enabled, I believe we lack HMP for strategic reasons (so that Samsung will sell more Exynos 5422 / Qualcomm-equipped machines).
You can't divorce the impact of display size and PPI from CPU performance. The GPU doesn't eliminate the CPU's role in graphics output. An i3 PC with a killer graphics card will perform worse graphically than an i7 PC with a lesser card, because most computational work (as opposed to rendering, texture mapping, vectoring, and decoding) is still done on the CPU. I have no idea what AnTuTu tests to come up with a CPU rating in isolation, but if it's a real-time performance test, the CPU's role in graphics output is affecting it.

So comparing the Neo with a 5.5" display at 267 PPI against the N10.1-14 with a 10.1" display at 299 PPI isn't going to get you a relevant CPU comparison. That's why I used the N3 and SGS4 as comparisons: only the PPI is off. And the Neo would be well behind the SGS4 in the cumulative AnTuTu test if it had the same PPI, because the lower workload of the lower PPI is artificially enhancing its score.

At the end of the day, an isolated CPU number is pretty meaningless. It's like bench horsepower in a car vs. horsepower at the wheels. A higher bench rating means nothing because none of us drive an engine; we drive a car. The total AnTuTu number (AKA drivetrain loss) is more relevant, even though it doesn't support the point you're trying to make about HMP.
http://en.wikipedia.org/wiki/Graphics_processing_unit#Computational_functions

BarryH_GEG said:
You can't divorce the impact of display size and PPI from CPU performance. The GPU doesn't eliminate the CPU's role in graphics output. An i3 PC with a killer graphics card will perform worse graphically than an i7 PC with a lesser card, because most computational work (as opposed to rendering, texture mapping, vectoring, and decoding) is still done on the CPU. I have no idea what AnTuTu tests to come up with a CPU rating in isolation, but if it's a real-time performance test, the CPU's role in graphics output is affecting it. So comparing the Neo with a 5.5" display at 267 PPI against the N10.1-14 with a 10.1" display at 299 PPI isn't going to get you a relevant CPU comparison. That's why I used the N3 and SGS4 as comparisons: only the PPI is off. And the Neo would be well behind the SGS4 in the cumulative AnTuTu test if it had the same PPI, because the lower workload of the lower PPI is artificially enhancing its score. At the end of the day, an isolated CPU number is pretty meaningless. It's like bench horsepower in a car vs. horsepower at the wheels. A higher bench rating means nothing because none of us drive an engine; we drive a car. The total AnTuTu number (AKA drivetrain loss) is more relevant, even though it doesn't support the point you're trying to make about HMP.
http://en.wikipedia.org/wiki/Graphics_processing_unit#Computational_functions
Maybe so, but the benchmark in question runs off-screen. So while resolution matters in real life, in the AnTuTu CPU score, or SuperPi, etc., it doesn't. HMP will make the CPU 50% faster in multi-threaded operations; I never claimed it makes the whole machine faster by the same amount. For example, an HMP-equipped Note 2014 would score around 40,000 in AnTuTu, NOT 49,500. I don't see where we disagree; I think you merely misunderstood my initial claim.

If you live for real-world use, the Exynos Note is a wonderful tablet. If you live in the world of needing the highest Quadrant and AnTuTu scores, you should pass.
Sent via Tapatalk and my thumbs.

Stevethegreat said:
With HMP enabled there is no comparison between the two: the Exynos is up to 50% faster and potentially more efficient. With HMP disabled (as things currently are), the Qualcomm is the slightly better chip, but I'm not convinced that the difference is enough to prefer one SoC over the other...
In short, the Exynos 5420 is artificially neutered to seem worse than the Qualcomm, yet, even so, going either way won't make much of a difference...
How did you enable HMP? My Note 3 Snapdragon is so much faster than my Note.
Sent from my SM-N900T using XDA Premium 4 mobile app

Stevethegreat said:
HMP for 8 cores has not yet been released, but look at the Note 3 Neo: it uses two fewer large cores and posts the same AnTuTu score as our Note, so by adding two more large cores you can expect the score to be about 50% higher. As I said, that is only true where all 8 cores are used at the same time and are not throttled (that is why I said "up to").
It will never be released for the Exynos 5420 either, unless Samsung wants a lot of complaints about fried Exynos 5420 chipsets. Also, they already said they won't release HMP for the Exynos 5420 because of the heat.

dt33 said:
It will never be released for the Exynos 5420 either, unless Samsung wants a lot of complaints about fried Exynos 5420 chipsets. Also, they already said they won't release HMP for the Exynos 5420 because of the heat.
Once again, that's not the reason they won't release it; if anything, the chip would run cooler, because more use of the A7 cores would be possible, and if all 8 cores were needed, Samsung could choose to throttle the thing. The reason they don't release it is the Exynos 5422, which is the same chip but with all 8 cores enabled (also 28nm)...
So no fried SoCs; lesser profits, more like.

Related

[Q] Thoughts on 'weak' GPU?

Hi
Noob here. I was just wondering what the general consensus is on the 'old' GPU the Galaxy Nexus will ship with. It isn't as powerful as, say, the Galaxy S2's or the iPhone 4S's, but will this have an overall effect on how the phone performs in day-to-day use? Will it only affect the high-end games currently available? I'm seriously tempted by this phone, mainly due to the lovely-looking ICS, but I'm concerned I may regret purchasing it if there are serious issues with the GPU.
Cheers.
I am presuming the GPU is clocked all the way up to 384MHz as the chip's specification says; if not, then curses to Google.
TBH I believe it'll be fine. Although it is an old GPU, it is still quite a powerful one and can handle almost every game fine. Tegra 2 is generally a weaker GPU than the SGX540 @ 200MHz and can still manage games just fine at 1280x800, so I don't see why the SGX540 @ 384MHz can't. Although we'll never know for sure until we get the phone.
I'll quote myself from the other thread here:
Here's a lovely anecdote: I use an Eyefinity (three-monitor) setup on my gaming rig. It's a general rule of thumb that (compared to a single 1080p monitor) adding an additional 1080p monitor will reduce your performance by about 30%. A third 1080p monitor will reduce your performance to about 50% of a single-screen setup.
Now consider: the Nexus Prime has about 2.4 times the number of pixels of the Nexus S. If the same formula as for a desktop GPU holds true for mobiles, we could expect about a 40% loss in 3D performance. But the GPU has been clocked up about 92%. Its throughput is now approaching double that of the Galaxy S, when it needed only make up a 40% deficit. Of course, if you consider diminishing returns from clock-speed scaling, the SGX540 @ 384MHz should perform at 720p about as well as it did at 200MHz and 480p. /shrug
The usual disclaimer: this was all conjecture on my part.
Like I said, that's just my theory, and it's got no real grounding (since I haven't used the new Nexus yet.)
Hope you guys are right, of course I'm not going to hold you to it, I just would like to have seen fresh architecture.
If we get a kernel, or I should say when we get a kernel that allows overclocking, does that only OC the CPU cores or will it OC the GPU even more?
Sent from my Droid using Tapatalk
And while you guys are skeptical of a GPU more powerful than the GeForce in the Tegra 2, which has its own games zone dedicated to its well-known-to-be-awesome-or-at-least-marketed-well performance, I'm rocking an Adreno 200 powering a thoroughly shattered-yet-still-working-perfectly 4.3" WVGA standard LCD display. That PowerVR is probably more powerful than the GeForce3 Ti 200 in my desktop.
I need a refresh.
Adreno 200 < Adreno 205 < SGX540 @ 200MHz < SGX540 @ 384MHz <= Adreno 220
The Adreno 205 is 2x the 200, but the SGX is around 1.5x the Adreno 205, and the 220 is 2x the Adreno 205... So the SGX540 @ 384MHz is similar to the Adreno 220 at the same res, but slower at 720p?
I think it's stupid that people think it's weak because:
* It isn't brand new
* They've never seen it clocked like it is and/or paired with the OMAP processor it's matched with.
* Have never played a game optimized for it
* Can't name a game/movie/program that will run on something else but not the combination mentioned above
* Assume that superficial benchmark results mean much in real world applications
The entire conversation is like talking about a way to make your race car's top speed go from 210mph to 230mph on a track that is designed to make it impossible to go faster than 175mph.
For the last time, this is NOT the same GPU that is in the SGS.
Dragooon123 said:
I am presuming the GPU is clocked all the way up to 384MHz as the chip's specification says; if not, then curses to Google.
TBH I believe it'll be fine. Although it is an old GPU, it is still quite a powerful one and can handle almost every game fine. Tegra 2 is generally a weaker GPU than the SGX540 @ 200MHz and can still manage games just fine at 1280x800, so I don't see why the SGX540 @ 384MHz can't. Although we'll never know for sure until we get the phone.
You got it wrong there, dude. The SGX540 @ 304MHz is equal to or 1-2% faster than the GeForce ULP GPU at 800x480 (note that this may be because of the dual-channel memory the 4430 SoC uses (Optimus 3D)). Also, the ULP GeForce does not work the same way as the SGX. The GeForce ULP tends not to take major performance hits when the resolution gets bigger, hence why all tablets use Tegra 2 (I have a source for this but can't find it right now); it was Nvidia's plan all along to grab the tablet market.
I hope the extra MHz helps the SGX540 perform well on the Galaxy Nexus when it comes to games and so on. If it doesn't, there are tricks to bypass things and get good gaming performance, but it's up to Google/Samsung to implement them.
I'm looking forward to trying the phone myself when it hits the stores, and hope it'll be OK...
taxas said:
You got it wrong there, dude. The SGX540 @ 304MHz is equal to or 1-2% faster than the GeForce ULP GPU at 800x480 (note that this may be because of the dual-channel memory the 4430 SoC uses (Optimus 3D)). Also, the ULP GeForce does not work the same way as the SGX. The GeForce ULP tends not to take major performance hits when the resolution gets bigger, hence why all tablets use Tegra 2 (I have a source for this but can't find it right now); it was Nvidia's plan all along to grab the tablet market.
I hope the extra MHz helps the SGX540 perform well on the Galaxy Nexus when it comes to games and so on. If it doesn't, there are tricks to bypass things and get good gaming performance, but it's up to Google/Samsung to implement them.
That might be the case, but I saw the SGX540 outperforming Tegra at 720p, so even at a tablet resolution the SGX540 doesn't fail to perform. Regardless, the GPU in the Galaxy Nexus is nothing short of high-end and should perform fine.
Sent from my GT-I9000 using Tapatalk
...and Tegra isn't that great either!
Regardless of whether the phone is fast or not, there is the overwhelming feeling that it could have been better. I think most people wanted a 543MP2 or, if it were possible, the 543MP4+ from the Vita (it isn't).
Sent from my SPH-D700 using xda premium
There is no SoC out yet apart from the A5 with the 543MP2. The lead time on an SoC is huge; I mean, they were designing the OMAP 4460 back in 2009 or earlier (the first mentions of the 4460 in white papers were in Feb 2009), but I'm sure they were working on it before then.
Sent from my GT-I9100 using xda premium
veyka said:
There is no SoC out yet apart from the A5 with the 543MP2. The lead time on an SoC is huge; I mean, they were designing the OMAP 4460 back in 2009 or earlier (the first mentions of the 4460 in white papers were in Feb 2009), but I'm sure they were working on it before then.
Sent from my GT-I9100 using xda premium
True. We know that the new A15s have been in development since at least 2009.
The Omap 5430 has a 544MPx; we don't know how many cores.
I suppose there was no alternative except the Exynos?
Sent from my SPH-D700 using xda premium
sauron0101 said:
True. We know that the new A15s have been in development since at least 2009.
The Omap 5430 has a 544MPx; we don't know how many cores.
I suppose there was no alternative except the Exynos?
Sent from my SPH-D700 using xda premium
Well, there is Exynos, OMAP, or Snapdragon for current-generation SoCs.
OMAP and Exynos are A9 cores. Snapdragon is kind of an A8 with extra SIMD performance.
That's generally why Snapdragon gets outperformed clock-for-clock by A9+NEON designs (that's why a 1.5GHz Snapdragon, e.g. the Sensation XL or the T-Mobile USA SGS2, is outperformed by a 1.2GHz Exynos).
I am happier with OMAP than Snapdragon, that's for sure.
Sent from my GT-I9100 using xda premium
A lot of people seem to be bemoaning the fact that this phone doesn't have a 1.5GHz Exynos 4212 or even the 4210. The big worry is that the chip may not run well at 1280x720, hence the "lag" we saw in the leaked videos.
There is disagreement on whether the Mali 400 or the SGX540 is better (at this clock, anyway), but there seems to be a consensus that the Exynos is a faster CPU than the OMAP 4. I suppose a few were hoping for an ARM Cortex-A15 with a 2-core SGX554. No such SoC currently exists, sadly.
I am also hopeful that there have been some software optimizations in Ice Cream Sandwich that could improve performance.
Part of me wonders if Google should do what Apple did: get its own semiconductor design department and outsource the actual fab. It seems to be offering Apple a competitive advantage of sorts.
my thoughts are that i don't care.
eric b
veyka said:
Well, there is Exynos, OMAP, or Snapdragon for current-generation SoCs.
OMAP and Exynos are A9 cores. Snapdragon is kind of an A8 with extra SIMD performance.
That's generally why Snapdragon gets outperformed clock-for-clock by A9+NEON designs (that's why a 1.5GHz Snapdragon, e.g. the Sensation XL or the T-Mobile USA SGS2, is outperformed by a 1.2GHz Exynos).
I am happier with OMAP than Snapdragon, that's for sure.
Sent from my GT-I9100 using xda premium
Agreed. Better OMAP 4 than Scorpion.
Apparently there are also a few people who were hoping for a Tegra 3. It might have been doable (and I stress the "might"), as the new Asus Transformer Prime is rumoured to carry Kal-El.
Sent from my SPH-D700 using xda premium
TBH the GPU and CPU are more than capable of handling the GUI; it's not like they are pulling out a fully 3D GUI. Even if the resolution is bumped, the hardware should still be able to handle it without breaking a sweat. It's only in games where the doubt arises.
sauron0101 said:
Agreed. Better OMAP 4 than Scorpion.
Apparently there are also a few people who were hoping for a Tegra 3. It might have been doable (and I stress the "might"), as the new Asus Transformer Prime is rumoured to carry Kal-El.
Sent from my SPH-D700 using xda premium
I am not sure if Kal-El is ready yet; I don't think the Transformer Prime is due till Q1 2012, and I'm not sure the smartphone Tegra 3 is ready either.
And Tegra 2 doesn't even have NEON!
Sent from my GT-I9100 using xda premium

Galaxy Nexus' Memory Bandwidth and efficiency puts it ahead of the competition

There have been a lot of people doing comparisons of various phones (particularly the Galaxy S2) to the Galaxy Nexus. I recalled Samsung and Google saying during their event that they chose industry-leading hardware for the phone, so I decided to look into this a little further. As I'm sure you're well aware, the first thing people tend to point at is benchmarks. The GPU benchmarks in particular have come under fire when people make their comparisons. Though the Mali 400 does bench higher than the SGX540, the Mali's higher performance isn't a tangible benefit, as "end device applications have not yet caught up with the highest graphics performance delivered by these" (http://armdevices.net/2011/10/26/interview-with-the-texas-instruments-omap4-team/). In other words, that's like having a road with a 300mph speed limit when the cars can only reach 120mph. Driving on the road with the 300mph limit won't get you there any faster than driving on a road with a 200mph limit if the speed of the car is the same.
As for processors, they are roughly on par with each other, both being 45nm A9s clocked at 1.2GHz. The difference is that the Exynos 4210 runs at its true clock speed of 1.2GHz, whereas the OMAP 4460 is actually underclocked to 1.2GHz from a true clock speed of 1.5GHz. Thus the processor has more speed potential than the Exynos.
One thing that does stand out as an advantage of the OMAP 4460 over some of the competition is its memory bus bandwidth. For those that don't know, in simple terms, it's how fast information can be read from and stored to memory by the processor. You can have the fastest processor in the world, but if you don't have enough memory bandwidth to accommodate the amount of information that needs to be transferred, that speed won't matter, because it will be bottlenecked.

For example, let's say you have a car that can reach 200mph and you want to drive it at full speed. However, the street you're driving on can only handle 20 cars at a time, and you're the 21st car; in this case you're going to be stuck in traffic. Sure, you have the raw potential of doing 200mph, but you won't ever get close to it because of congestion. The same concept applies to memory bandwidth.

That being said, the memory bandwidth on the OMAP 4460 is 6.4GB/s, the Exynos 4210 is 6.4GB/s, the iPhone 4S is 6.4GB/s, and the Tegra 2 is 2.5GB/s. Add to this the fact that the TI processor is underclocked to 1.2GHz (for power savings) as opposed to running at full strength and full power (i.e. Galaxy S2), and you have what is in my opinion the superior processor.

Personally, I'd rather have an underclocked processor that delivers the same or better performance and saves me power than a GPU with excessive power that I can't even make use of. It's kind of like having a rocket launcher for self-defense: sure, it's powerful, but it isn't something you can really make use of. In closing, I think AnandTech said it best: "Until Tegra 3 and Krait show up, the CPU side of the 4460 is as good as it gets."
Sources:
http://www.samsung.com/global/business/semiconductor/productInfo.do?fmly_id=844&partnum=Exynos 4210
http://www.anandtech.com/show/2911
https://docs.google.com/viewer?a=v&...mA2j6n&sig=AHIEtbTE8LvpHXPUcE4w_wGU5apdbGD0Eg
http://armdevices.net/2011/10/26/interview-with-the-texas-instruments-omap4-team/
https://plus.google.com/105051985738280261832/posts/2FXDCz8x93s
http://www.phonearena.com/news/Why-...Galaxy-Nexus-Android-ICS-poster-child_id23089
http://www.anandtech.com/show/5133/galaxy-nexus-ice-cream-sandwich-initial-performance
Nice post!
Gave me some more insight into the decision for the OMAP processor and makes me feel I made a really good decision ordering the GN.
Maybe some paragraphs in your text would be nice, to make it easier to read.
Kind regards.
Thanks, now I feel even better for choosing my Nexus....
Wow, that was very informative. All that confusion the past couple of months about why the G-Nex would be using the TI OMAP 4460 instead of what were believed to be more powerful processors like the Exynos. This post really cleared that up for me.
thanks!
Thanks !! Nice and simple.
Whoa, huge walloftext.jpg
Anyway, hate to break it to you, but the processors in the Galaxy Nexus are 4460s; however, they're probably binned 1.5GHz processors, i.e. processors that couldn't run at the full 1.5GHz.
https://twitter.com/#!/coolbho3k/status/140218721774997504
https://twitter.com/#!/coolbho3k/status/140214089183010819
Oh, and this: https://plus.google.com/105051985738280261832/posts
The Galaxy Nexus runs at a significantly higher res than the Galaxy S II and has to push many, many more pixels...
Rawat said:
Whoa, huge walloftext.jpg
Anyway, hate to break it to you, but the processors in the Galaxy Nexus are 4460s; however, they're probably binned 1.5GHz processors, i.e. processors that couldn't run at the full 1.5GHz.
https://twitter.com/#!/coolbho3k/status/140218721774997504
https://twitter.com/#!/coolbho3k/status/140214089183010819
Oh, and this: https://plus.google.com/105051985738280261832/posts
Galaxy Nexus runs at significantly higher res than Galaxy S II, and has to push many many more pixels...
I think you're missing my point. Yes, the full speed potential of the 4460 chip is 1.5GHz, but it is clocked at 1.2GHz. Whether or not you will be able to clock the chip back up to 1.5 isn't what I'm getting at. The point is that it has comparable processing power at a lower clock speed than it is spec'd at, meaning it can give comparable processing power to its competitors while using less power, which is obviously a good thing.

As for your last statement, you do realize the person you linked to is one of the sources I cited. The fact that so many more pixels have to be pushed is exactly why having more memory bandwidth is so important. Had they gone with an Exynos processor clocked at 1.2 instead, there's a chance the user experience would have suffered.

Think about this: the Galaxy Note has the Exynos, but they had to clock it at 1.5 (instead of 1.2) AND give it a 2,500mAh battery to get decent battery life. Obviously they couldn't fit a battery of that capacity (or even close to it) in the Galaxy Nexus, so if they put an Exynos in the Nexus clocked that high, there would be a serious battery problem; and if they put a 1.2 in there with the lower memory bandwidth, there could also be potential user-experience issues. The overall point I'm making is that the 4460 is actually a very good chip due to its high memory bandwidth and the fact that it's more power efficient.
"More power efficient"
More power efficient than what? The Exynos is pretty power efficient itself, and the Note doesn't have such a large battery to counter the Exynos's clock speed; it's because it's a huge fricking phone and they can fit one inside.
The OMAP4460 is rather decidedly the third-best smartphone SoC on the market. The A5 and Exynos are ahead (well, not the CPU on the A5, but the GPU more than makes up for it), but it's better than Tegra 2 and Snapdragon. At least that's something, eh?
Awesome! Well explained, thank you; make sticky please!
Sent from my X10i using xda premium
Rawat said:
"More power efficient"
More power efficient than what? The Exynos is pretty power efficient itself, and the Note doesn't have such a large battery to counter the Exynos's clock speed; it's because it's a huge fricking phone and they can fit one inside.
The OMAP4460 is rather decidedly the third-best smartphone SoC on the market. The A5 and Exynos are ahead (well, not the CPU on the A5, but the GPU more than makes up for it), but it's better than Tegra 2 and Snapdragon. At least that's something, eh?
I will keep my response brief, since it's obvious you didn't read what I posted (judging from the fact that you posted a link to the same person I already had in my sources). That being said, explain why the Exynos and the A5 are ahead. Instead of making a generalized statement, please use some facts to support what you state. If that is what you think, I'd love to read why.
Rawat said:
Anyway, hate to break it to you, but the processors in the Galaxy Nexus are 4460s; however, they're probably binned 1.5GHz processors, i.e. processors that couldn't run at the full 1.5GHz.
Hate to tell you this, but if it is indeed a 4460, these are 1.5GHz parts. Plain and simple: if they weren't, they would have a wildly different part number than the one shown in the pictures or on the ICs themselves (think of Intel CPUs... the new i7 39xx series are binned Xeon parts). Why? To put it bluntly, false advertising. Every single thing online states 1.5GHz for the part. No literature (that I can find) says anything less. And yes, I know a few things have the wording "up to", but that doesn't change the fact that it's still a 1.5GHz part; it just isn't rated for higher than 1.5GHz. It's similar to how Apple clocks the A5 down to 800MHz for the iPhone 4S: some power savings at the price of a small bit of performance. Does this mean the A5s in the iPhone 4S can't do 1GHz? Probably not.
Got proof of the alleged binning? Then maybe I'll start considering that belief. But until I see real proof, I highly doubt that TI is selling binned parts that can't make 1.5GHz. It would be kind of pointless to say the 4460 is a 1.5GHz part but sell it with a max of 1.2GHz without at least changing the part number in some way (e.g. to 4450).
mysterioustko said:
I think you're missing my point. Yes, the full speed potential of the 4460 chip is 1.5GHz, but it is clocked at 1.2GHz. Whether or not you will be able to clock the chip back up to 1.5 isn't what I'm getting at. The point is that it has comparable processing power at a lower clock speed than it is spec'd at, meaning it can give comparable processing power to its competitors while using less power, which is obviously a good thing.
This is faulty reasoning. You are claiming that because the OMAP4460 in the GN is underclocked from 1.5GHz to 1.2GHz, it must consume less power than an Exynos 4210 clocked at 1.2GHz. This is only true if the OMAP4460 at 1.5GHz consumes the same amount of power as the Exynos 4210 at 1.2GHz, but we have no evidence that this is the case. The OMAP4460 at 1.5GHz might simply have a higher thermal envelope than the Exynos 4210 at 1.2GHz and be able to draw more power. Thus the OMAP4460 at 1.2GHz might consume power comparable to the Exynos 4210.
darkhawkff said:
Hate to tell you this, but if it is indeed a 4460, these are 1.5 GHz parts. Plain and simple: if they weren't, they would have a different part number (think of Intel CPUs... the new i7 39xx series are binned Xeon parts...) than the one shown in the pictures or on the ICs themselves. Why? To put it bluntly, false advertising. Every single source online states 1.5 GHz for the part. No literature (that I can find) says anything less than that. And yes, I know a few things use the wording 'up to', but that doesn't change the fact that it's still a 1.5 GHz part; it just isn't rated for higher than 1.5 GHz. It's similar to how Apple clocks the A5 down to 800 MHz for the iPhone 4S: some power savings at the price of a small bit of performance. Does this mean the A5s in the iPhone 4S can't do 1 GHz? Probably not.
Got proof of the alleged binning? Then maybe I'll start considering that belief. But until I see real proof, I highly doubt that TI is selling binned parts that can't make 1.5 GHz. It would be pointless to say the 4460 is a 1.5 GHz part but sell it with a max of 1.2 GHz without at least changing the part number in some way (e.g. to a 4450).
Click to expand...
Click to collapse
From the Galaxy Nexus kernel source, arch/arm/mach-omap2/id.c:
Code:
    if (cpu_is_omap446x()) {
        si_type =
            read_tap_reg(OMAP4_CTRL_MODULE_CORE_STD_FUSE_PROD_ID_1);
        switch ((si_type & (3 << 16)) >> 16) {
        case 2:
            /* High performance device */
            omap4_features |= OMAP4_HAS_MPU_1_5GHZ;
            omap4_features |= OMAP4_HAS_MPU_1_2GHZ;
            break;
        case 1:
        default:
            /* Standard device */
            omap4_features |= OMAP4_HAS_MPU_1_2GHZ;
            break;
        }
    }
There appears to be something in the OMAP hardware that designates whether it is a "high performance device" or a "standard device". A standard device can only operate at 1.2GHz, not 1.5GHz. It is unclear if "device" here refers to the SoC or the phone. If it refers to the SoC, then it would suggest that the SoCs are binned into high and low performance categories, with the low performance devices incapable of performing at 1.5GHz.
But here's some preliminary evidence that the SoC itself may be missing something that's required for 1.5GHz to work: http://forum.xda-developers.com/showpost.php?p=19931580&postcount=97
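Read in isolation, the fuse check in the kernel snippet above just extracts bits 16 and 17 of the production-ID register. A minimal Python mirror of that logic (the register values passed in below are hypothetical, purely for illustration):

```python
def omap4460_speed_bin(si_type):
    """Mirror the kernel check: bits 16 and 17 of the fuse register
    select the speed bin (2 = high performance, anything else = standard)."""
    bin_bits = (si_type & (3 << 16)) >> 16
    return "1.5 GHz capable" if bin_bits == 2 else "1.2 GHz max"

# Hypothetical fuse register values, for illustration only:
print(omap4460_speed_bin(2 << 16))  # 1.5 GHz capable
print(omap4460_speed_bin(1 << 16))  # 1.2 GHz max
```

So if "device" does refer to the SoC, whether a given chip may run at 1.5GHz is burned into the fuses at the factory, not decided by software.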
mysterioustko said:
That being said, the memory bandwidth on the omap 4460 is 7.5GB/s the exynos 4210 is 6.4GB/s, and the Tegra 2 is a mere 2.5 GB/s.
Click to expand...
Click to collapse
Where are you getting that the memory bandwidth of the OMAP 4460 is 7.5GB/s? The Galaxy Nexus uses the Samsung K3PE7E700M-XGC1 1GB memory package, which is a 400MHz, LPDDR2, 32-bit dual-channel memory package. This means it has a memory bandwidth of 400 * 2 (for DDR) * 32 * 2 (for dual-channel) = 51200Mb/s = 6.4GB/s, same as the Exynos 4210.
See: http://www.samsung.com/us/business/oem-solutions/pdfs/PSG2011_web.pdf for details on the memory package used in the GN.
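The bandwidth arithmetic above is easy to check; a quick sketch (the 400MHz / 32-bit / dual-channel figures are taken from the post, not independently verified against the datasheet):

```python
def ddr_bandwidth_gb_s(clock_mhz, bus_width_bits, channels):
    """Peak DDR bandwidth: clock x 2 (double data rate) x bus width x channels."""
    bits_per_second = clock_mhz * 1e6 * 2 * bus_width_bits * channels
    return bits_per_second / 8 / 1e9  # bits -> bytes -> GB/s

# Figures from the post for the Galaxy Nexus memory package:
print(ddr_bandwidth_gb_s(400, 32, 2))  # 6.4
```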
mysterioustko said:
I will keep my response to you brief, since it's obvious you didn't read what I posted (judging from the fact that you posted a link to the same person I already had a link to in my sources). That being said, explain why the Exynos and the A5 are ahead. Instead of making a generalized statement, please use some facts to support what you state. If that is what you think, I'd love to read why.
Click to expand...
Click to collapse
I actually read your whole post, but skipped over reading the sources you had linked to.
The A5 and Exynos are widely regarded as the two best SoCs on the market (not including the Tegra 3, which just launched). Don't take my word for it, see anandtech here and here. But of course, it's hard to directly compare the A5 to another (non-Apple) SoC, because they run on different OSes.
Rawat said:
I actually read your whole post, but skipped over reading the sources you had linked to.
A5 and Exynos are widely regarded as the 2 best SoC on the market (not including Tegra3, which just launched). Don't take my word for it, see anandtech here and here. But of course, it's hard to directly compare A5 to another (non-apple) SoC, because they run on different OSes
Click to expand...
Click to collapse
I don't care what it's "widely regarded" as. I asked for concrete information that supports what you state. You're supporting your argument by stating that people's opinion of it is that it's the best... that's not exactly a compelling argument.
I didn't read a word, seeing as how it's all one big wall of text, but I got the gist of it from the title. Yay, my Nexus is even further ahead of the competition now than it was 20 seconds ago
mysterioustko said:
I don't care what it's "widely regarded" as. I asked for concrete information that supports what you state. You support your argument by stating that people's opinion of it is that it's best....that's not exactly a compelling argument.
Click to expand...
Click to collapse
Clock speeds aside, the A5 is clearly a better SoC. The CPUs in all three are pretty much the same, dual-core Cortex A9s on a 45nm process. It comes down to the GPU. The A5 has a better GPU simply because its SGX543MP2 is pretty much a newer, multicore version of the PowerVR GPU in the OMAP4460. Between the OMAP4460 and the Exynos 4210, it's more difficult to say. The PowerVR SGX540 and Mali-400 have different strengths and weaknesses, so I won't speculate here.
I would also suggest that you modify your original post. It contains quite a bit of misinformation and clearly many people have read it and taken it to heart. That's not a good thing, and I hope you will do the responsible thing and try to reverse the misinformation that you've spread.
Chirality said:
Clock speeds aside, the A5 is clearly a better SoC. The CPUs in all three are pretty much the same, dual-core Cortex A9s on a 45nm process. It comes down to the GPU. The A5 has a better GPU simply because it's pretty much the multicore version of the GPU in the OMAP4460. Between the OMAP4460 and the Exynos 4210, it's more difficult to say. The PowerVR540 and Mali400 have different strengths and weaknesses, so I won't speculate here.
I would also suggest that you modify your original post. It contains quite a bit of misinformation and clearly many people have read it and taken it to heart. That's not a good thing, and I hope you will do the responsible thing and try to reverse the misinformation that you've spread.
Click to expand...
Click to collapse
Omap vs Exynos ? The latter doesn't support HSPA+ or LTE.
Dmw017 said:
Omap vs Exynos ? The latter doesn't support HSPA+ or LTE.
Click to expand...
Click to collapse
This has hardly anything to do with "performance"... and not to mention that ICS is a TON more dependent on GPU rendering, hardly a place where it has any room to fall short.
Sent from my SAMSUNG-SGH-I777 using XDA App
Dmw017 said:
Omap vs Exynos ? The latter doesn't support HSPA+ or LTE.
Click to expand...
Click to collapse
Exynos will do 21Mbps, HSPA+
Sent from my GT-I9100 using Tapatalk
Chirality said:
Where are you getting that the memory bandwidth of the OMAP 4460 is 7.5GB/s? The Galaxy Nexus uses the Samsung K3PE7E700M-XGC1 1GB memory package, which is a 400MHz, LPDDR2, 32-bit dual-channel memory package. This means it has a memory bandwidth of 400 * 2 (for DDR) * 32 * 2 (for dual-channel) = 51200Mb/s = 6.4GB/s, same as the Exynos 4210.
See: http://www.samsung.com/us/business/oem-solutions/pdfs/PSG2011_web.pdf for details on the memory package used in the GN.
Click to expand...
Click to collapse
You are correct, I mistakenly read the 4470's bandwidth when researching the 4460 (the 4470's bandwidth was referenced in the same TI interview where they discussed the 4460).

[Q] HOX beats S3?

I really wanna know:
I've got a 1.5 GHz processor (4+1 cores) + graphics card
I've got the design
I've got Beats
I've got Sense
The S3 has got:
a 1.4 GHz processor
AMOLED...
TouchWiz
a battery (2200 mAh) with a low-energy-usage CPU
I expect to see my benchmark scores pass the S3 when JB comes.
5 min ago I installed Geekbench 2;
my phone got 1300 points (power saving off) but the S3 got 1700 -.-
It's one of the most important things for me: HAVING A BETTER PROCESSOR THAN THE S3.
Can anyone explain to me whether the HOX has a better CPU or not?
And what about the iPhone 5? Does it beat the HOX too? With a dual core??
This thread will be closed.
The HOX and S3 are basically on a par with each other. However the S3 just edges out the HOX for a few different reasons. First, most software needs further optimising for Tegra devices and secondly because the S3 doesn't have the S-On/S-Off problem.
It's worth noting that the HOX is closer in terms of following Google's phone design guidelines (no menu button) and also that the screen is better.
The Tegra 3 version of the HOX has a slower CPU than its dual-core version.
It's like comparing a Q8200 with an E8600 and then running dual-core-optimized programs.
Hmmm, thx for the explanation
Sent from my HTC One X using Tapatalk 2
Basically there isn't much of a difference... the Exynos is a bit faster (effectively) than the Tegra 3. The S3 has the worse screen and the HTC the better one: the S3 has an AMOLED display while the HTC has an LCD. But they're kinda the same. I've used both phones; Sense is good, but I'd still go for TouchWiz. The lack of toggles in the notification menu really bugs me (there are no official HTC toggles, just the Play Store ones). It's just my opinion.
Sent from my GT-I9300 using xda premium
I have the One X but I think the S3 is better overall
The S3 has a faster processor, smoother gaming and UI, better quality camera, touchwiz features (pop up play, multi-window, smart rotation) and most importantly a great XDA thread
However the HOX does have a much better (& sturdier) design, better screen, THD games support, Beats audio
Headless_monkeyhunta96 said:
I have the One X but I think the S3 is better overall
The S3 has a faster processor, smoother gaming and UI, better quality camera, touchwiz features (pop up play, multi-window, smart rotation) and most importantly a great XDA thread
However the HOX does have a much better (& sturdier) design, better screen, THD games support, Beats audio
Click to expand...
Click to collapse
Regarding Beats: OK, so when the driver is on, the sound is just amazing; it blows away any other phone in that regard. But, but, but, I think they deliberately degraded the sound when you play music without the Beats driver, just so you can say the difference is sooo big. And it is a good marketing trick, gotta admit that.
Sent from my GT-I9300 using xda premium
The amount of stupidity in this thread is unbelievable...
The S3 beats the Tegra 3 even though it has a slightly slower clock speed for a few reasons, mainly because the Exynos chip in the S3 is a 32nm chip as compared to the Tegra 3's 40nm process, so the Exynos is somewhat more efficient due to the smaller process. Also, blame Nvidia for crappy software optimisation. Furthermore, the Mali 400 chip in the S3 is far more powerful than the puny Tegra 3 ULP Geforce chip. Don't say more cores = more power, that is not true. Besides, the S4 in the HOXL is more powerful than the Tegra 3 because the S4 has the Cortex A15 architecture which gives about 40% more processing power per core against the Cortex A9. The comparison of a dual core CPU and a quad core CPU using a dual core optimised software I saw somewhere above in this thread means nothing in ARM terms. The Cortex A9 (for example Tegra 3) uses all 4 cores and loses against the Snapdragon S4, say the MSM8960 which the HOXL has.
All other discussions about S3 and HOX w.r.t. features (touchwiz is a feature?!) should be reserved for other threads.
Sent from my HTC One X using Tapatalk 2
pandaball said:
The amount of stupidity in this thread is unbelievable...
The S3 beats the Tegra 3 even though it has a slightly slower clock speed for a few reasons, mainly because the Exynos chip in the S3 is a 32nm chip as compared to the Tegra 3's 40nm process, so the Exynos is somewhat more efficient due to the smaller process. Also, blame Nvidia for crappy software optimisation. Furthermore, the Mali 400 chip in the S3 is far more powerful than the puny Tegra 3 ULP Geforce chip. Don't say more cores = more power, that is not true. Besides, the S4 in the HOXL is more powerful than the Tegra 3 because the S4 has the Cortex A15 architecture which gives about 40% more processing power per core against the Cortex A9. The comparison of a dual core CPU and a quad core CPU using a dual core optimised software I saw somewhere above in this thread means nothing in ARM terms. The Cortex A9 (for example Tegra 3) uses all 4 cores and loses against the Snapdragon S4, say the MSM8960 which the HOXL has.
All other discussions about S3 and HOX w.r.t. features (touchwiz is a feature?!) should be reserved for other threads.
Sent from my HTC One X using Tapatalk 2
Click to expand...
Click to collapse
The amount of stupidity in this is ridiculous.
The Snapdragon S4 doesn't use Cortex A15; it uses Krait cores.
Furthermore, the S4 beats the Tegra 3 because at the time benchmark reviews came out, most of them were optimized for dual core; the T3 beats the S4 in terms of raw power, and of course software does its part.
The Mali-400 doesn't really beat the GeForce. It's running in 16-bit mode, and it's vertex limited. Sure, it has a good fillrate, but it cannot run vertex-heavy games.
The GeForce ULP runs in 32-bit and is pixel limited, which basically means it's a draw, but when you factor in THD games, the GeForce wins.
Sent from my faster than SGS3 HOX.
XxVcVxX said:
The amount of stupidity in this is ridiculous.
Snapdragon S4 doesn't use Cortex A15, it uses Krait cores.
Furthermore, the S4 beats Tegra 3 because at the time benchmark reviews came out, most of them were optimized for dual core, the T3 beats S4 in terms of raw power, of course software will do its part.
Mali-400 doesn't really beat the GeForce. Its running in 16 bit mode, and its vertex limited. Sure it has a good fillrate, but it cannot rim vertex heavy games.
GeForce ULP runs in 32 bit, and is pixel limited, which basically means its a draw, but when you factor in THD games, GeForce wins.
Sent from my faster than SGS3 HOX.
Click to expand...
Click to collapse
I facepalmed. Especially at the very first statement. What architecture does Krait use, I wonder.
As for S4 not beating Tegra, S4 came out *after* Tegra 3. You'd think benchmarks would be optimised for quad cores before they became optimised for Cortex A15.
As for the last one, you forgot that ULP Geforce is not superscalar. The GPU cores have to wait for the first instruction to complete before the next one can process, making the process slow as hell. Mali is far more powerful than Tegra (just look at benchmarks), because the GPU cores are far beefier than the Tegra GPU cores, and also because ULP Geforce is based on Fermi cores which are a bit old and slow at this point.
Sent from my HTC One X using Tapatalk 2
pandaball said:
I facepalmed. Especially at the very first statement.
Sent from my HTC One X using Tapatalk 2
Click to expand...
Click to collapse
Read on Wikipedia more. Krait isn't Cortex A15.
Sent from my faster than SGS3 HOX.
XxVcVxX said:
Read on Wikipedia more. Krait isn't Cortex A15.
Sent from my faster than SGS3 HOX.
Click to expand...
Click to collapse
It's A15 plus Qualcomm enhancements...
Sent from my HTC One X using Tapatalk 2
Krait is a custom architecture made by Qualcomm. It's similar to the A15, but it's not the A15; its performance sits between the A9 and the A15, but it's more power efficient than the A15.
It's like Scorpion, whose performance was between the A8 and the A9.
As for the GeForce ULP being Fermi, you're wrong. It's definitely not Fermi, since it still has separate vertex and pixel cores, so it's even older than the GT200.
The Mali-400 is old, and it's not beefier than the GeForce. Samsung made it up to par by overclocking it extensively and forcing 16-bit rendering on the thing. It ****s on the GeForce in pixel fill rate, but the GeForce ****s on it in vertex output, so it's kind of a draw.
Most games run smoother on Mali because most applications on the Play Store are optimized for the biggest phone company: Samsung. You can see how Gameloft downright ignored Tegra.
EDIT: The ULP is using the NV47
Sent from my faster than SGS3 HOX.
XxVcVxX said:
Krait is a custom architecture made by Qualcomm. Its similar to A15, but its not A15, and performance sits between A9 and A15, bit its more power efficient than A15.
Its like Scorpion, where the performance was between A8 and A9.
As for GeForce ULP running with Fermi, you're wrong. Its definitely not Fermi since it still has seperate vertex and pixel cores, so its even before GT200.
Mali-400 is old, and its not beefier than GeForce. Samsung made it up to par by overclocking extensively and forcing 16 bit rendering on the thing. It ****s on GeForce on pixel fill rate, but GeForce ****s on it on vertex output, so its kinda a draw.
Most games run smoother on Mali because most applications on Play Store is optimized for the biggest phone company : Samsung. You can see how Gameloft downright ignored Tegra.
EDIT: The ULP is using NV47
Sent from my faster than SGS3 HOX.
Click to expand...
Click to collapse
To use slightly crude terms, Qualcomm licensed A15 from ARM, then beat it with sticks until it became more optimised. Qualcomm has a slightly different license from ARM which allows them to take the design by ARM, beat it into shape then sell it.
As for Fermi in Tegra, I was mistaken. I didn't refer to anything, and my offhand memory sucks.
For Mali vs Tegra, refer to this: http://m.gsmarena.com/snapdragon_s4_pro_benchmarked_crushes_older_chipsets-news-4563.php. Look at the benchmark list, particularly GLBenchmark offscreen, since it's the most relevant.
Sent from my HTC One X using Tapatalk 2
pandaball said:
To use slightly crude terms, Qualcomm licensed A15 from ARM, then beat it with sticks until it became more optimised. Qualcomm has a slightly different license from ARM which allows them to take the design by ARM, beat it into shape then sell it.
As for Fermi in Tegra, I was mistaken. I didn't refer to anything, and my offhand memory sucks.
For Mali vs Tegra, refer to this: http://m.gsmarena.com/snapdragon_s4_pro_benchmarked_crushes_older_chipsets-news-4563.php. Look at the benchmark list, particularly GLbenchmark offscreen since its the most relevant.
Sent from my HTC One X using Tapatalk 2
Click to expand...
Click to collapse
It's an offscreen test; as I have stated before, the GeForce is pixel limited, so at HD resolutions it falls behind Mali. However, remember that Mali is running at 16 bit.
Sent from my faster than SGS3 HOX.
XxVcVxX is right; we discussed this in hamdir's thread for a long time, and if the S3 were running 32-bit like ours, it would be just the same as ours.
BTW, both phones have pros and cons, so just choose whichever you like.
XxVcVxX said:
Its offscreen test, as I have stated before, GeForce is pixel limited, at HD resolutions, it becomes less than Mali, however remember Mali is running at 16 bit.
Sent from my faster than SGS3 HOX.
Click to expand...
Click to collapse
In offscreen 720p, Mali still (overall) eats Tegra, although what you said is correct, which makes me wrong. Therefore, I accept defeat and bestow my RD status to you
Sent from my HTC One X using Tapatalk 2
pandaball said:
In offscreen 720p, Mali still (overall) eats Tegra, although what you said is correct, which makes me wrong. Therefore, I accept defeat and bestow my RD status to you
Sent from my HTC One X using Tapatalk 2
Click to expand...
Click to collapse
Lol wut.
I accept this honor, and I thank my friends and family for supporting me, and most of all, I thank pandaball for arguing with me XDXD
Sent from my faster than SGS3 HOX.
XxVcVxX said:
Lol wut.
I accept this honor, and I thank my friends and family for supporting me, and most of all, I thank pandaball for arguing with me XDXD
Sent from my faster than SGS3 HOX.
Click to expand...
Click to collapse
But seriously, thanks. I learnt something, although losing to a stranger in an argument online on my birthday is totally the best way to start my year
Sent from my HTC One X using Tapatalk 2
pandaball said:
But seriously, thanks. I learnt something, although losing to a stranger in an argument online on my birthday is totally the best way to start my year
Sent from my HTC One X using Tapatalk 2
Click to expand...
Click to collapse
Hell, if I knew you were RD I wouldn't be so aggressive xD
Damned mobile app.
Happy Birthday
Sent from my faster than SGS3 HOX.

Why octa-core?

The Galaxy Tab S products that are available to me have an octa-core processor, with the high-speed cores at 1.9 GHz. I can't really understand why Samsung chose that instead of a 2.3 GHz quad-core like in the Tab Pro.
See Wikipedia for an explanation of the concept: http://en.m.wikipedia.org/wiki/ARM_...multi-processing_.28global_task_scheduling.29
Because the Exynos 5 Octa-core is the one processor that Samsung has to be able to compete with Snapdragon 800, and is cheaper to implement since it's their own processor. I don't buy the Octa-core hype, I'd be happier with the Snapdragon 800 honestly like on the Tab PRO 8.4.
The question is:
does the Tab S use all 8 cores at the same time?
It seems it does NOT; the little cores are only used when low power is required.
So performance-wise, this CPU is slower than the SD 800.
ssuper2k said:
The question is:
Does TAB S use the 8 cores at the same time?
It seams it does NOT, little cores are only used when low power is required..
So performance wise, this CPU is slower than SD 800
Click to expand...
Click to collapse
And yet I am getting 35,300 on Antutu using Shaheer's T800 ROM, which is higher than any other current tablet or phone. (Shaheer's ROM should go out of beta today; don't flash until the final has been posted.)
The Tab Pro 8.4's Antutu is 32,806.
I CANT PLAY NOVA 3 with exynos !
AND GAMING IS NOT SO SMOOTH ! STILL A BIT LAGGY
I can see the argument that you don't always need full power, hence the four slow cores, but since all eight cores can't run at once it seems a cheat to have 1.9 GHz as the top speed for the four fast cores. Since cores step up and down as needed (or at least I assume they do), a Snapdragon 800 or higher at 2.3 GHz would have been just fine. I mean, if you are going to put in 3 GB of RAM, you should put in a great CPU too, and not pretend that less (1.9 GHz) is a better contribution to what is supposed to be a premium tablet.
And yet I don't think Samsung is doing enough to utilize this hardware. In theory it should run at least 4x faster and 6x more efficiently than the Snapdragon and Apple's current A8 chip. It has failed to outshine the competition because Samsung's software department sucks. Samsung's hardware is still great, though.
sku|| said:
I CANT PLAY NOVA 3 with exynos !
AND GAMING IS NOT SO SMOOTH ! STILL A BIT LAGGY
Click to expand...
Click to collapse
Blame the developer for not making it compatible. The Tegra-powered HTC One X is incompatible too, so I'm not sure it's an Exynos issue.
i wish my t805 had Full HD screen resolution :cyclops:
Funny. Was just browsing the web a bit on my i5 ultrabook and it occurred to me that the browser on my Tab S is actually faster. If gaming is your primary thing, I'd buy the Nvidia Shield, not the Tab S. This tablet is designed for eye candy media consumption (internet and video) not for gaming enthusiasts. Try running your PC video card at 2560 x 1600 on ultra and see what you get.
I heard from a Samsung rep I actually enjoy talking to that Sammy had just figured out running all cores at once, and we should see updates that turn that feature on. When this will happen, who knows. I also didn't ask him for a link and now can't find that info on the web, so when I see him again soon I will get more info.
I would assume (insert what you know that means) that when/if this happens, the full power of this setup would greatly improve things?
Anyway, my Tab S has been running snappy for me; no complaints at this time.
You cannot compare clock speeds from two different processors. For instance, you can't compare the 1.9 GHz quad-core of the Exynos to the 2.3 GHz quad-core of the Snapdragon 800; the numbers mean nothing across architectures. Comparing the clock speeds of two Snapdragon chips is fine, as is comparing two Exynos chips. But comparing an Exynos clock to a Snapdragon clock is like comparing an Intel clock to an AMD clock.
The Exynos chip in this tablet has been shown to compete very well/close with the Snapdragon on every level except GPU. The Mali GPU in this chip just doesn't match the Adreno GPU from the Snapdragon. However, the RAM is faster in the Exynos than the Snapdragon.
That said, I am a fan of the Snapdragon chip, of course. I was holding off to see if the LTE variant of this tablet would have the Snapdragon 800, but instead they shipped with an Intel LTE modem. Besides apps/games not being optimized for Exynos, I am fairly satisfied with my purchase. I'm just anxious to get CyanogenMod(or any other AOSP ROM installed on it).
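The clock-speed caveat above boils down to per-core performance being roughly IPC x clock, and IPC differs across architectures. A toy illustration (the IPC numbers below are invented purely for illustration, not measurements of any real chip):

```python
# Toy model: relative per-core performance ~ IPC * clock (GHz).
# The IPC values are made up to illustrate the point.
chips = {
    "chip A (1.9 GHz, higher IPC)": (1.2, 1.9),
    "chip B (2.3 GHz, lower IPC)": (0.9, 2.3),
}
for name, (ipc, ghz) in chips.items():
    print(f"{name}: relative perf = {ipc * ghz:.2f}")
# The lower-clocked chip can still come out ahead, so cross-vendor
# clock comparisons by themselves say little.
```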
fletch33 said:
i had heard from a Samsung rep i actually enjoy talking to that Sammy had just figured the all cores at once and we should see updates that turn that feature on. when this will happen who knows. i also did not ask him for a link and now cant find that info on the web so when i see him again soon i will get more info.
i would assume (insert you know what that means) that when/if this happens the full power of this setup would greatly improve?
anyway i have had my Tab S running snappy for me and no complaints at this time
Click to expand...
Click to collapse
Could also mean increased battery consumption,don't know. Overall I am satisfied with this Tab including battery life.
There are 3 different performance results:
a) what the Exynos 5420 achieves in practice now, measured by some benchmarks and real-world performance (<= Snapdragon 800)
b) what it could do theoretically, but will never reach due to driver and scheduler issues (>> Snapdragon)
c) what it will do some day in the near future on an optimized ROM (somewhere in between?)
Fortunately the Exynos 5420 does support all 8 cores in parallel, see here:
http://www.notebookcheck.net/Samsung-Exynos-5420-Octa-SoC.103633.0.html
pibach said:
There are 3 different performace results:
a) what Exynos 520 does achieve in practice now, measured bei some benchmarks and real world performance (<= Snapdragon 800)
b) what it could do theoretically - but will never happen due to driver and scheduler etc issues (>> Snapdragon)
c) what it will do some day in near future on an optimized ROM (somewhere in between?)
Fortunately the Exynos 5420 does support all 8 cores in parallel, see here:
http://www.notebookcheck.net/Samsung-Exynos-5420-Octa-SoC.103633.0.html
Click to expand...
Click to collapse
Wish I knew how. Probably a linux thing. ...
If it is possible to implement in today's existing source, I'm sure @AndreiLux would know about it ?
UpInTheAir said:
Wish I knew how. Probably a linux thing. ...
If it is possible to implement in today's existing source, I'm sure @AndreiLux would know about it ?
Click to expand...
Click to collapse
It's impossible.
AndreiLux said:
It's impossible.
Click to expand...
Click to collapse
What and why?
pibach said:
What and why?
Click to expand...
Click to collapse
http://www.androidauthority.com/sam...ta-can-use-eight-cores-simultaneously-267316/
I've found a few articles saying it should support it, then a couple of devs saying they had to go to the 5422 for a working implementation of HMP.
Here is a post from odroid
http://forum.odroid.com/viewtopic.php?f=97&t=5651
That's weird: the (newer) 5422 supports HMP but not 3 GB of RAM.

SD821 Underclocked in pixel devices

The Pixel XL and the Pixel are packed with the Snapdragon 821 chipset, which is supposed to be clocked at 2x2.35 GHz Kryo + 2x2.0 GHz Kryo, but both Pixel phones are clocked at 2x2.15 GHz Kryo + 2x1.6 GHz Kryo, which is exactly the same as the SD820 in the LG G5 and the S7. So if someone knows what the difference is between the CPU in the Pixel phones and the regular Snapdragon 820, please write it down.
From what I have read, the 821 is an 820; the 821 is just a higher-binned 820. When they make chips, they are not all the same: some are a little more efficient than others due to very minor differences in the silicon. A highly binned 820 that can handle a higher clock speed while using less power gets sold as an 821.
So Google decided to go with the 821 because it is more power efficient than an 820, but it seems Google thinks the speed of the 820 is fast enough to provide a snappy user experience. So they are doubling down on efficiency by clocking these highly binned chips down to the same speeds as the 820. Say the 821 is 5% more efficient at stock speed than the 820; the 821 might be 10% more efficient at the 820's clock speed while delivering the same performance.
From the hands-ons I have seen, everyone describes the phone as very fast. This is likely due to Google optimizing Android for the Pixel's hardware, much like Apple does with the iPhone. The Pixel also has some hardware features that might not show up on a regular spec sheet: improved touch-screen latency and faster storage. Because of these factors, Google decided they don't need the extra performance of the 821 and instead want to use its efficiency.
TL;DR: Google is going all in on the Pixel providing a very fast user experience while being power efficient!
So in theory once kernel source has been released we can just OC it back to "stock" frequency and get even faster performance with a hit to battery life.
I have the OP3 and the phone is clocking to max. frequency very rarely anyway. So there is no reason to clock it down for better efficiency.
So basically the Pixel XL and the Pixel have a Snapdragon 820 with a different name and better efficiency; as a result the gaming performance is the same as on the LG G5 or the S7, for example. These Pixel devices aren't worth the extra $200.
ramqashou said:
So basically the pixel xl nd the pixel have snadragon 820 with a different name and better efficiency, as a result the gaming performance is the same as on the lg g5 or the s7 for example, these pixel devices arent worth the extra 200$
Click to expand...
Click to collapse
The smaller Pixel has the potential to outdo both of those phones and the Pixel XL in gaming, since it has a native resolution of 1080p. The lower the resolution, the higher the frames per second possible in games on the same SoC, assuming the game runs at your phone's native resolution.
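The resolution point is just pixel-count arithmetic; assuming GPU load scales roughly with pixels rendered per frame:

```python
# Pixels per frame at each native resolution
pixels_1080p = 1920 * 1080  # Pixel
pixels_qhd = 2560 * 1440    # Pixel XL

print(f"QHD renders {pixels_qhd / pixels_1080p:.2f}x the pixels of 1080p")  # 1.78x
```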
ramqashou said:
So basically the pixel xl nd the pixel have snadragon 820 with a different name and better efficiency, as a result the gaming performance is the same as on the lg g5 or the s7 for example, these pixel devices arent worth the extra 200$
Click to expand...
Click to collapse
In the current climate, and with the 810 fiasco still overshadowing things, can you really blame them for dialing it down? Perhaps the GPU is still clocked higher in the 821, and I'll take the efficiency as a perk. It's up to you whether it's worth $200 more, but there are a few less-talked-about extras included in the price.
---------- Post added at 07:11 AM ---------- Previous post was at 07:09 AM ----------
mixedguy said:
The smaller Pixel has the potential to out do both of those phones and the Pixel XL in gaming since it has a native resolution of 1080p. The lower the resolution, the higher frames per second possible in games when using the same SoC, assuming the game is made to run at your phones native resolution.
Click to expand...
Click to collapse
I'd rather have 1080p at 60FPS than 2k at 30FPS on a screen that size, however I think most games, at least the big titles, have adjustable resolution so I think the only difference will be battery draw.
Hoodeddeathman said:
I'd rather have 1080p at 60FPS than 2k at 30FPS on a screen that size, however I think most games, at least the big titles, have adjustable resolution so I think the only difference will be battery draw.
I agree. I wasn't aware you could choose your resolution in mobile games, as I don't really play demanding games on my phone. I assumed it was like mainstream game consoles, where the developer predetermines the resolution or just defaults to the native res.
I play games on PC, so it's pretty cool that you can change the resolution in mobile games like you can in PC games.
mixedguy said:
I agree. I wasn't aware you could choose your resolution in mobile games, as I don't really play demanding games on my phone. I assumed it was like mainstream game consoles, where the developer predetermines the resolution or just defaults to the native res.
I play games on PC, so it's pretty cool that you can change the resolution in mobile games like you can in PC games.
As I understand it, Android has the capability and it's up to the devs to implement it. The game can be rendered at whatever resolution and will then be upscaled. For example, Warhammer Freeblade lets you select the resolution and texture quality just as you would in most PC games, at the risk of losing frames, whereas Need for Speed: No Limits selects a predefined profile depending on the device.
As I said, underclocking doesn't automatically mean better efficiency... If you had an 820 phone you would know that. I experimented a lot with different CPU settings on my OnePlus 3, and underclocking is not worth it: it only cuts off performance and does NOT increase efficiency, because your CPU is at max frequency for something like 1% of the runtime anyway. Over more than two days, 2.15 GHz on the big cluster was active for only 49 s on my OP3.
And the fact that the 821 reaches a higher frequency doesn't automatically mean the chip is higher quality. I know it's possible that 820s are just bad 821s that didn't pass quality tests, but I don't think so, because the 820 was released much earlier. Usually it goes the other way around, as with GPUs: Nvidia first releases the very high-end models and then sells the weaker chips in the lower-end models. I don't think Qualcomm is saying, hey, let's pick out all the really good 820s and pile them up to sell as 821s... An 821 could be better and more efficient, but that's not necessarily true. A good 820 could still be as good as or even better than an 821 in efficiency. Also think about AMD processors a few years ago: whole cores were unlockable, and there was still room for overclocking if you were lucky.
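The residency point above can be sanity-checked with quick arithmetic, using the figures from the post (two days of uptime, 49 s at the 2.15 GHz ceiling):

```python
# Back-of-envelope check: if the big cluster spent only 49 s at its
# 2.15 GHz ceiling over ~2 days of uptime, capping or removing that
# top step can barely move the total energy budget.

uptime_s = 2 * 24 * 3600   # two days, in seconds
time_at_max_s = 49         # reported time spent at 2.15 GHz

fraction = time_at_max_s / uptime_s
print(f"Time at max frequency: {fraction:.4%}")  # ~0.03% of uptime

# Even if the top step drew twice the power of the next one down,
# dropping it would save well under 0.1% of total energy.
```

This supports the claim that shaving the top frequency step mostly costs peak performance rather than buying meaningful battery life.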
Gerrit507 said:
As I said, underclocking doesn't automatically mean better efficiency... If you had an 820 phone you would know that. I experimented a lot with different CPU settings on my OnePlus 3, and underclocking is not worth it: it only cuts off performance and does NOT increase efficiency, because your CPU is at max frequency for something like 1% of the runtime anyway. Over more than two days, 2.15 GHz on the big cluster was active for only 49 s on my OP3.
When talking about efficiency I'm referring more to undervolting as opposed to underclocking. It may be that they chose those frequencies because the 821 steps up in voltage beyond that point, increasing power consumption and heat. We'll have to wait and see how the Pixel performs, but if that underclock caps the thermal load lower, we will also see less throttling, which is ideal for Daydream.
As an example I would point to overclocking desktop CPUs; the architecture is different, but how it responds to heat and power is not. Beyond a certain frequency the CPU requires exponentially more power and generates exponentially more heat the higher you go.
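The voltage-squared relationship behind this argument can be sketched numerically. The voltages below are purely illustrative assumptions, not published Snapdragon 820/821 figures:

```python
# Dynamic CPU power scales roughly as P ~ C * V^2 * f, and voltage
# must rise with frequency, so the last few hundred MHz cost
# disproportionately more power than they gain in speed.

def relative_power(freq_ghz, volts):
    # Capacitance term C cancels when comparing ratios.
    return freq_ghz * volts ** 2

p_cap = relative_power(2.15, 1.00)   # assumed voltage at the 2.15 GHz cap
p_max = relative_power(2.40, 1.10)   # assumed voltage at the 2.4 GHz peak

print(f"Clock gain: {2.40 / 2.15 - 1:.1%}")      # ~11.6% more frequency
print(f"Power gain: {p_max / p_cap - 1:.1%}")    # ~35% more power
```

Under these assumed voltages, the top 250 MHz costs roughly three times as much in power as it delivers in clock speed, which is consistent with capping the frequency for thermals.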
http://m.gsmarena.com/google_pixel_xl_benchmark_doesnt_show_performance_improvement-news-20927.php
This benchmark disproves the claims of those who insist the chipset in the Pixel phones is better than the original Snapdragon 820.
It might be only to reduce heat. The battery-efficiency gain is, IMO, very marginal.
But I will surely put the 2.4 GHz back on mine.
firewave said:
It might be only to reduce heat. The battery-efficiency gain is, IMO, very marginal.
But I will surely put the 2.4 GHz back on mine.
For some reason I don't believe in overclocking, because it's beyond the device's capabilities and it might cause problems.
This seems like a big piece of marketing by Google. It isn't really an SD 821; it's an SD 820.
The 821 only has about a 10% performance increase when clocked at its max frequency, so even if Google had left it at max frequency, a 10% increase would be barely noticeable, if noticeable at all, in real-world use.
The 821 does have some features that aren't available on the 820, which is probably why Google chose the 821 over the 820. I found this info about two important 821 features not found in the 820 and quoted it below.
"One of the main reason why Google used the Snapdragon 821 in the Pixel phones is the Snapdragon VR SDK (Software Development Kit). This is entirely unavailable with the Snapdragon 820. The new SDK comes with advanced VR toolset to give the developers broad access to the internal architecture of the Snapdragon 821 chipset. This is extremely useful and fully compatible with Google Daydream platform. The VR SDK helps in the rendering of cutting-edge visual and audio which helps in state of the art Virtual Reality experience."
"Another important thing which is unknown for most people is about the camera improvements brought by the MSM8996 Pro. The SoC can simultaneously use two phase detectors for significant improvement in focussing quality and time. On the contrary, the Snapdragon 820 or MSM8996 only supports single PDAF (Phase Detecting Auto Focus) systems. The newer chipset extends the range of laser autofocus technology. This will substantially boost the laser-assisted autofocus systems of upcoming smartphones."
ramqashou said:
For some reason I don't believe in overclocking, because it's beyond the device's capabilities and it might cause problems.
That's a very incorrect statement. The kernel determines the clock speed. Google could choose something like 0.5GHz if they were so inclined. The phone would run like ****, but in your eyes, the device is not "capable" of anything faster. It sounds like Google purposely underclocked these. If nothing else, you are absolutely 100% fine to clock it back to the speed that Qualcomm, the OEM of the chipset, intended it to run at. True overclocking can present problems, but I have overclocked my CPUs, RAM, and GPUs for YEARS with no issues and reaped plenty of extra benefits in terms of performance. I used to do it on my smartphones too, but it is pointless and wastes battery for almost every use scenario.
Google specifically chose 2.15GHz instead of 2.4GHz as specified by Qualcomm, either due to heat issues or battery life benefit. I am going to guess they realized that their incredibly light and optimized software does not need a 2.4GHz CPU speed - hell, my 6P is faster with a SD 810 than my Note7 with an 820 in day to day use for a reason, that reason being stock Android is incredibly quick and efficient.
That is true from the chip standpoint. What you don't know, though, is whether Google/HTC designed the heat-removal system to handle the additional heat produced at full clock speeds without throttling...
Sent from my Nexus 6P using Tapatalk
JasonJoel said:
That is true from the chip standpoint. What you don't know, though, is whether Google/HTC designed the heat-removal system to handle the additional heat produced at full clock speeds without throttling...
Sent from my Nexus 6P using Tapatalk
The phone being a unibody aluminium shell should help with that. My 5X gets mega hot when I run games or during extended screen-on time, but its back is plastic. Using the whole surface of the phone as an additional heat sink, so to speak, could help with dissipation.
Either way, I hope someone tries to OC it back to "stock" Qualcomm speeds. I will certainly try it to see the results, that is, if custom kernels become a thing on the Pixel.
Nitemare3219 said:
That's a very incorrect statement. The kernel determines the clock speed. Google could choose something like 0.5GHz if they were so inclined. The phone would run like ****, but in your eyes, the device is not "capable" of anything faster. It sounds like Google purposely underclocked these. If nothing else, you are absolutely 100% fine to clock it back to the speed that Qualcomm, the OEM of the chipset, intended it to run at. True overclocking can present problems, but I have overclocked my CPUs, RAM, and GPUs for YEARS with no issues and reaped plenty of extra benefits in terms of performance. I used to do it on my smartphones too, but it is pointless and wastes battery for almost every use scenario.
Google specifically chose 2.15GHz instead of 2.4GHz as specified by Qualcomm, either due to heat issues or battery life benefit. I am going to guess they realized that their incredibly light and optimized software does not need a 2.4GHz CPU speed - hell, my 6P is faster with a SD 810 than my Note7 with an 820 in day to day use for a reason, that reason being stock Android is incredibly quick and efficient.
That's true, I can't deny the power of stock Android, but there are many other OEM custom skins that are well optimized and plenty fast, such as Sense, LG UX 5.0, and even OxygenOS.
