[Q] Qualcomm question.. - G2 and Desire Z Q&A, Help & Troubleshooting

First of all I'm new to the forum, so hello lol
I have a question about Qualcomm... I know that the Scorpion processor is their next-generation 45nm chip, but what I don't understand is why it's clocked at 800MHz.
They made such a big deal about crossing the 1GHz mark that I would assume they'd never look back. I might not have my facts straight, but doesn't a 45nm part run clock cycles just like a 65nm one, only more efficiently in terms of battery consumption?
It looks as if they wanted to get in on the low-end Android market share. As for the G2, which I love, my friend ran Quadrant Pro in front of me and the CPU scored what looked like 20% lower than the Nexus One on 2.2 (my guess from looking at it), but of course the GPU trashed the Nexus One.
I saw an interview on Engadget about a week ago with a Google executive who said there will soon be a clear line between low-end and high-end Android devices. I wonder if HTC is in contact with Google about future updates so they can release devices adequate enough to run them, or are they just blindly releasing high-build-quality devices lol
Sorry about the long post, but I had come up with a few questions that I didn't want to ask anywhere else.
Thanks.

Wait until our geniuses figure out root; then you can happily run it at 1GHz+. If you look at the spec sheets for the MSM7230 you'll see it's rated for 800-1000MHz. Higher speeds = lower battery life, so the reasons for clocking it lower are very practical. My G2 averages 1600-1650 in Quadrant, so I don't think it's that bad.

azzeh3 said:
First of all I'm new to the forum, so hello lol
I have a question about Qualcomm... I know that the Scorpion processor is their next-generation 45nm chip, but what I don't understand is why it's clocked at 800MHz.
They made such a big deal about crossing the 1GHz mark that I would assume they'd never look back. I might not have my facts straight, but doesn't a 45nm part run clock cycles just like a 65nm one, only more efficiently in terms of battery consumption?
It looks as if they wanted to get in on the low-end Android market share. As for the G2, which I love, my friend ran Quadrant Pro in front of me and the CPU scored what looked like 20% lower than the Nexus One on 2.2 (my guess from looking at it), but of course the GPU trashed the Nexus One.
I saw an interview on Engadget about a week ago with a Google executive who said there will soon be a clear line between low-end and high-end Android devices. I wonder if HTC is in contact with Google about future updates so they can release devices adequate enough to run them, or are they just blindly releasing high-build-quality devices lol
Sorry about the long post, but I had come up with a few questions that I didn't want to ask anywhere else.
Thanks.
MHz is a misleading statistic by which to judge performance. A 1GHz processor is not guaranteed to be faster than an 800MHz one. What is always true, though, is that a given processor running at 1GHz will draw more power than the same processor running at 800MHz. Because these new processors are so fast even at 800MHz, they are competitive with, or faster than, anything else on the market today. By clocking them a bit slower, they also get quite serviceable battery life.
In daily use, you are unlikely to notice much difference between 800MHz and 1GHz with this CPU. Both are plenty fast, and most of the time the CPU will be waiting for you, not the other way around. At 1GHz, however, you would immediately see shorter battery life, so to my mind the sacrifice is well worth it.
As for any fears that this might be a "low-end phone": make no mistake, this is the premier Android phone on the market right now. The Droid 2 may have a higher profile, but it is slower, on a slower network, and uses a non-standard GUI (not to mention it's a whole lot more expensive once you factor in the price of service). No phone will have every feature people want, but as far as raw capability goes, there is no better phone on the market today as far as I can see.
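A back-of-the-envelope sketch of that clock-vs-battery trade-off, using the standard dynamic-power relation P ≈ C·V²·f. The voltages below are made up purely for illustration; real DVFS tables are chip- and device-specific.

```python
# Rough sketch of the clock-vs-power trade-off: dynamic CPU power
# scales roughly as P ~ C * V^2 * f. Holding voltage constant,
# 800 MHz draws ~80% of the 1 GHz figure; if the lower clock also
# allows a lower voltage, the V^2 term makes the saving bigger.

def dynamic_power(freq_mhz, volts, capacitance=1.0):
    """Relative dynamic power in arbitrary units."""
    return capacitance * volts ** 2 * freq_mhz

p_1ghz = dynamic_power(1000, 1.10)       # hypothetical 1 GHz @ 1.10 V
p_800_same_v = dynamic_power(800, 1.10)  # same voltage, lower clock
p_800_dvfs = dynamic_power(800, 1.00)    # hypothetical DVFS voltage drop

print(f"800 MHz, same voltage: {p_800_same_v / p_1ghz:.0%} of 1 GHz power")
print(f"800 MHz at 1.00 V:     {p_800_dvfs / p_1ghz:.0%} of 1 GHz power")
```

With constant voltage the saving tracks the clock (80%); with even a modest voltage drop it falls to roughly two-thirds, which is why vendors clock down instead of just idling faster cores.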

Those seem to be very valid points, thanks for the input.
One other thing: when exactly do you need 1GHz of processing speed? I mean, back in the day the MacBook Air used a 1.5GHz processor...
Also, where does RAM come into play?

azzeh3 said:
Those seem to be very valid points, thanks for the input.
One other thing: when exactly do you need 1GHz of processing speed? I mean, back in the day the MacBook Air used a 1.5GHz processor...
Also, where does RAM come into play?
RAM is a far bigger determinant of day-to-day performance than processor speed, at least up to a point. You could have a 10GHz computer, but with only 64KB of RAM it would crawl along miserably.
In either case there is a point of diminishing returns, though as applications become more demanding that point keeps rising. I remember selling Macs back in the early-to-mid '90s and telling people not to worry: 16MB of RAM (a huge amount back then, when the standard was 4MB) would handle anything they could throw at it. Little did I know that just 15 years later I would have 500x that much in my desktop and 32x as much in my cell phone!
I am by no means an Android systems expert, but from what I have read there is not much benefit to having more than 512MB of RAM with the current versions of the Android OS. I would have preferred 1GB of RAM just as a growth path, but I can understand why they didn't. Each of these features costs money, so you have to draw the line somewhere; you can't include every feature people might ask for in every phone.
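For what it's worth, the growth figures in that anecdote check out as plain arithmetic:

```python
# Checking the RAM-growth arithmetic from the post above:
# 16 MB in a mid-90s Mac, 500x that in a desktop, 32x in a phone.
mid90s_mb = 16
desktop_mb = mid90s_mb * 500    # "500x that much in my desktop"
phone_mb = mid90s_mb * 32       # "32x as much in my cell phone"

print(f"desktop: {desktop_mb / 1024:.1f} GB")  # ~7.8 GB
print(f"phone:   {phone_mb} MB")               # 512 MB
```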

You're right lol. How will the big companies make any money if they give you everything you want...
I've never kept a phone for more than 8 months because of updated specs, but lately there's such a boom in technology that it's going to be more like 4 months now hahaha

Related

Clocking Pros & Cons

Can someone please tell me the pros and cons of overclocking on a BA with WM5? My device certainly picks up in speed and performance when I clock it to 530MHz. I'm just somewhat afraid of what damage this might do to the device. Thanks in advance for all your input!
Brief Article
Found this brief but useful article on overclocking.
http://www.pcworld.com/news/article/0,aid,109556,tk,dn022603X,00.asp
Well, for one thing this article is over 3 years old! Heed its warning, but don't take it as your only source. A Google search for "overclocking pda" brings up all kinds of articles. Most people have been running their PDA processors overclocked for years.
And that bit about a processor being thrown back and run at a slower speed in the same device is a bunch of bull. When a company sells a unit and says it runs at 400MHz, the processor in that unit WILL run at 400MHz.
The way the testing procedure actually goes is that the chip manufacturer produces a processor with a target speed, tests it at all kinds of speeds, and determines a reliable speed for it. In many cases processors are tested at as much as 30% above what you see them rated at. Then they go out to companies like HTC, who customize the units, test them across a wide range of speeds, and finally settle on the rating.
There's quite a bit of marketing and foresight that goes hand in hand with these decisions. This is why you will notice things like the same PXA270 processor in the Apache and the Universal differing by 100MHz.
There was also a PIII 800MHz processor that could reliably run at 1.2GHz, because the same silicon was sold clocked differently across several models.
In most cases a little research will turn up a "reliable" overclock range for any device.
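That headroom argument is easy to put in numbers. The figures here are illustrative, not vendor specs: a 400MHz-rated part validated 30% above its rating would have been tested to about 520MHz, so the 530MHz overclock asked about is only a hair past the hypothetical test ceiling.

```python
# Illustrative binning/headroom math for the testing story above.

def overclock_percent(stock_mhz, target_mhz):
    """Overclock expressed as a percentage over the rated speed."""
    return (target_mhz - stock_mhz) / stock_mhz * 100

rated_mhz = 400
tested_ceiling = rated_mhz * 1.30        # "tested ~30% above rating"
oc = overclock_percent(rated_mhz, 530)   # the 530 MHz OC in question

print(f"validated up to ~{tested_ceiling:.0f} MHz")
print(f"530 MHz is a {oc:.1f}% overclock")
```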

Motoblur port to Evo?

Not sure if this should be posted in the development section, but I figured it's more of a question than anything development-related.
Does anyone know if it would be possible to port over the new edition of MotoBlur? And before we all freak out, watch the videos of the new Droid X
(here: http://www.engadget.com/2010/06/15/exclusive-motorola-droid-x-preview/)
Judging by the specs I've seen...
Some of the listed specs include:
4.4-inch, FWVGA 854 x 480-resolution screen
Android 2.1 with "some new sort of Motoblur"
8-megapixel camera
Ability to record 720p video
Multitouch keyboard
1GHz ARMv7 processor
8GB of storage space
...The processor is identical; that should help, no? It doesn't seem like too much of a stretch, but I'm no developer.
I just really like the clean, vanilla look of the new MotoBlur; SenseUI has always seemed a little boring to me. Any thoughts/ideas?
The processors currently both run at 1GHz, but they are FAR from the same processor. The Evo sports a Snapdragon from Qualcomm while the Droid X has an OMAP from Texas Instruments.
It's a bit premature, but I have a feeling that processor is a better candidate for overclocking. Hopefully all those leaked Quadrant tests we saw of this phone really were on Froyo.
That processor is about the only thing I want from that phone.
Motoblur? No thanks. Not even with a cool name like "ninjablur".
I still say Evo all the way
Sent from my PC36100 using Tapatalk
Yes, the Droid X is using NinjaBlur, a more "adult" version of MotoBlur. As for a port, I don't even think the old MotoBlur was ported to anything besides the G1; it's bad. Sense > Moto/NinjaBlur by far. ADW or LauncherPro > Blur also.
itmustbejj said:
The processors currently both run at 1GHz, but they are FAR from the same processor. The Evo sports a Snapdragon from Qualcomm while the Droid X has an OMAP from Texas Instruments.
It's a bit premature, but I have a feeling that processor is a better candidate for overclocking. Hopefully all those leaked Quadrant tests we saw of this phone really were on Froyo.
That processor is about the only thing I want from that phone.
Motoblur? No thanks. Not even with a cool name like "ninjablur".
I still say Evo all the way
Sent from my PC36100 using Tapatalk
Shows you how much I know; thanks for clearing that up for me. I was under the impression that since they were both "1GHz ARMv7" processors they were more similar than that.
Don't get me wrong, my point was not to talk down the Evo: it is by far the best device I have ever used, and I find the X bulbous and not as clean-cut as my Evo. But as much as I love my Evo, I despise SenseUI. Maybe if it were called NinjaUI I would feel differently.
I guess I'll just have to wait for a nice AOSP or Froyo ROM.
booyakasha said:
Shows you how much I know; thanks for clearing that up for me. I was under the impression that since they were both "1GHz ARMv7" processors they were more similar than that.
Don't get me wrong, my point was not to talk down the Evo: it is by far the best device I have ever used, and I find the X bulbous and not as clean-cut as my Evo. But as much as I love my Evo, I despise SenseUI. Maybe if it were called NinjaUI I would feel differently.
I guess I'll just have to wait for a nice AOSP or Froyo ROM.
LOL, I'm not one to bite people's heads off for "questioning the Evo". It's so early into our Evo ownership that we have every right to question our purchase (especially within the 30-day window). As for me, I have no questions about the Evo; I haven't had any of the hardware defects lots of other people have been reporting.
My only issue with my decision is Sprint's data speeds. Some people get great speeds and I am uber jealous. Indianapolis is the 11th-biggest city in the US and my data speeds here blow: I routinely get 150kbps downstream on 3G. With a "premium data" add-on, you tell me how you would feel paying extra and getting stuck with those speeds...
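To put 150kbps in perspective, here's the raw arithmetic. kbps is kilobits per second, so divide the file size in bits by the link rate (decimal megabytes used for simplicity; the file size is just an example).

```python
# Time to download a file at a given link rate, for the 3G speeds above.

def download_seconds(size_mb, rate_kbps):
    bits = size_mb * 8 * 1000 * 1000     # decimal MB -> bits
    return bits / (rate_kbps * 1000)     # kbps -> bits per second

print(f"5 MB at 150 kbps:  {download_seconds(5, 150):.0f} s")   # ~267 s
print(f"5 MB at 1500 kbps: {download_seconds(5, 1500):.0f} s")  # ~27 s
```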
gah shame on my double posting self.
itmustbejj said:
LOL, I'm not one to bite people's heads off for "questioning the Evo". It's so early into our Evo ownership that we have every right to question our purchase (especially within the 30-day window). As for me, I have no questions about the Evo; I haven't had any of the hardware defects lots of other people have been reporting.
My only issue with my decision is Sprint's data speeds. Some people get great speeds and I am uber jealous. Indianapolis is the 11th-biggest city in the US and my data speeds here blow: I routinely get 150kbps downstream on 3G. With a "premium data" add-on, you tell me how you would feel paying extra and getting stuck with those speeds...
Didn't notice you were from Indy as well; we are in need of some 4G lovin'. I'm not home enough to report on my 3G speeds there, but I can tell you it can't be much worse than the 1x I get while at school in Bloomington.
I've been lucky as well. I have a little bit of the screen rise, but not enough to make a fuss about yet. I just hate that damn Sense. I stumbled upon Avalaunch's EVOlution ROM; that'll hold me over until a proper 2.2 can be released... then I'll never look back.
I'll run some speed tests this weekend when I'm home and let you know how mine looks. I'm up on the NE side.
EDIT: that probably confused you, didn't realize I was logged in on my other account :/
I still am very interested in seeing MotoBlur brought to the EVO; I'm actually trying to work on that but I have no real time.

[Q] Kaiser 3D Drivers For Android - SOLVED - CAN BE CLOSED

I was wondering: 3D performance is a lot better on the Kaiser when using the video drivers (of course).
But is there any way to get the 3D driver to work in Android?
Since I don't think it's possible to just install the driver on WM and then run Android (because it fully shuts down WM), I was wondering whether there's a way to get that nice smooth performance on Android as well.
Or is there any app that makes the Kaiser a bit faster (graphics-wise)?
Thanks a lot!!!
Answer:
3D drivers are implemented (if that's the correct word; I'm from Belgium, so) in Android.
syntax1993 said:
I was wondering: 3D performance is a lot better on the Kaiser when using the video drivers (of course).
But is there any way to get the 3D driver to work in Android?
Since I don't think it's possible to just install the driver on WM and then run Android (because it fully shuts down WM), I was wondering whether there's a way to get that nice smooth performance on Android as well.
Or is there any app that makes the Kaiser a bit faster (graphics-wise)?
Thanks a lot!!!
Aren't they already implemented using OpenGL?
If you're running Android you already have the drivers; they are in the kernel. I think we need to make a big sticky of that somewhere, third time I've seen it asked this week.
aceoyame said:
If you're running Android you already have the drivers; they are in the kernel. I think we need to make a big sticky of that somewhere, third time I've seen it asked this week.
I second that
syntax1993 said:
Or is there any app that makes the Kaiser a bit faster (graphics-wise)?
Thanks a lot!!!
As has already been said, HW3D is implemented in the kernel, which utilises the Qualcomm chip in our Kaisers. But the hardware graphics acceleration, although better than nothing, is pretty crap compared to the new phones coming along, so you can't expect miracles. Just be glad it has any at all and that Android can actually use it, unlike Windows Mobile!
scooter1556 said:
As has already been said, HW3D is implemented in the kernel, which utilises the Qualcomm chip in our Kaisers. But the hardware graphics acceleration, although better than nothing, is pretty crap compared to the new phones coming along, so you can't expect miracles. Just be glad it has any at all and that Android can actually use it, unlike Windows Mobile!
I would not call 32 frames per second in Neocore, with your build, pretty crap.
Well, it can't be compared to new phones, but you must admit that this is more than enough to run 3D games (we can play Raging Thunder 2!).
Millence said:
I would not call 32 frames per second in Neocore, with your build, pretty crap.
Well, it can't be compared to new phones, but you must admit that this is more than enough to run 3D games (we can play Raging Thunder 2!).
No, it is quite impressive for an old-timer, but obviously it can't keep up with the new hardware on the market, and therefore with the applications/games targeted at those devices. It's also a shame it isn't man enough for new video codecs, although installing arcMedia, which uses FFmpeg as its backend, improves things a little and adds support for more formats.
Thank you for answering; I had no idea it was integrated into the kernel.
I've heard that the Kaiser had a quite good video chip, but that's probably compared to the other phones at the time.
Well, my phone is running Quake 2 at about 10FPS (a bit higher, about 15-20, when looking into corners etc.) and I was hoping for a bit more, but it seems that isn't easy to achieve on Android.
Thanks a lot again for the quick answer; I hadn't found any post on the forum which answered my question so...
Syntax1993
Haven't looked well enough then, it seems.
I'm sorry.
A while back I looked at the performance of our integrated 3D and it is about on par with a Rage 128 from what I remember... which is pretty bad lol. Not to mention that on Android we have to run through Java, and we have a pretty weak FPU. In Linpack, with my barebones RLS 3 overclocked to 572MHz, I get 3.8 MFLOPS, which is pretty bad lol. That was with JIT working properly, even.
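Those Linpack numbers imply a strikingly low floating-point throughput per clock, which is really the point about the weak FPU plus the Java layer:

```python
# FLOPs-per-cycle implied by the Linpack figure quoted above.
mflops = 3.8          # measured MFLOPS
clock_mhz = 572.0     # overclocked core speed

# Both are "millions per second", so the ratio is FLOPs per clock cycle.
flops_per_cycle = mflops / clock_mhz
print(f"~{flops_per_cycle:.4f} FLOPs per clock cycle")
```

Well under one floating-point op per hundred cycles, versus roughly one per cycle for a desktop-class FPU of the era.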
aceoyame said:
A while back I looked at the performance of our integrated 3D and it is about on par with a Rage 128 from what I remember... which is pretty bad lol. Not to mention that on Android we have to run through Java, and we have a pretty weak FPU. In Linpack, with my barebones RLS 3 overclocked to 572MHz, I get 3.8 MFLOPS, which is pretty bad lol. That was with JIT working properly, even.
Have you had any problems clocking that high? I'm a bit scared to clock higher than 450MHz because I don't want to brick my phone, tbh.
Would be cool to clock that high.
*Afraid to clock that high LOL*
syntax1993 said:
Have you had any problems clocking that high? I'm a bit scared to clock higher than 450MHz because I don't want to brick my phone, tbh.
Would be cool to clock that high.
*Afraid to clock that high LOL*
And there's the battery consumption issue... even if that speed can be reached, you'll need a really long-lasting extended battery. I've got a Seidio Innocell 1600mAh; on a Donut build with the .25 kernel, overclocked to 470MHz, my battery lasts me like 10-12 hours if I use moderate WiFi, BT or GPS and keep my data on 2G only... If I keep WiFi or GPS on all the time and use 3G, it probably wouldn't last more than 4-6 hours.
albertorodast2007 said:
And there's the battery consumption issue... even if that speed can be reached, you'll need a really long-lasting extended battery. I've got a Seidio Innocell 1600mAh; on a Donut build with the .25 kernel, overclocked to 470MHz, my battery lasts me like 10-12 hours if I use moderate WiFi, BT or GPS and keep my data on 2G only... If I keep WiFi or GPS on all the time and use 3G, it probably wouldn't last more than 4-6 hours.
Well, since data here costs a lot of money (I don't know what it's like where you live), I rarely use it.
GPS is off, and WiFi is only on at home or for short periods.
BT is also rarely used, and off when not in use.
I'm using a 2880mAh battery, so the clocking won't be that much of a problem, and I can recharge it every night.
Aren't there high costs for 2G and 3G? It's way too expensive to have it turned on all day.
Would I run into any problems clocking to around 500MHz or something like that?
syntax1993 said:
Well, since data here costs a lot of money (I don't know what it's like where you live), I rarely use it.
GPS is off, and WiFi is only on at home or for short periods.
BT is also rarely used, and off when not in use.
I'm using a 2880mAh battery, so the clocking won't be that much of a problem, and I can recharge it every night.
Aren't there high costs for 2G and 3G? It's way too expensive to have it turned on all day.
Would I run into any problems clocking to around 500MHz or something like that?
If you're using a data plan it's relatively cheaper... I've overclocked my little HTC Tilt to 520MHz, especially when trying heavy apps, and never had an issue (well, never had more issues than the normal ones LOL). You'll feel it gets a bit "warm" on the backside (maybe due to the higher battery consumption), and the speed increase isn't that much beyond a certain point, but maybe you'll be luckier than me! (This is common in every OC: my cousin and I both have an EVGA GTX 275; I can get mine to run almost 100MHz higher than stock, and if my cousin even tries to touch the values his PC hangs!) Talking about that... the only thing I've never tried is overclocking the GPU (I've seen that option in atools). Dunno if it's doable on our Kaisers and whether there's a real increase/decrease in performance... if you give it a try, maybe you could publish your results...
albertorodast2007 said:
If you're using a data plan it's relatively cheaper... I've overclocked my little HTC Tilt to 520MHz, especially when trying heavy apps, and never had an issue (well, never had more issues than the normal ones LOL). You'll feel it gets a bit "warm" on the backside (maybe due to the higher battery consumption), and the speed increase isn't that much beyond a certain point, but maybe you'll be luckier than me! (This is common in every OC: my cousin and I both have an EVGA GTX 275; I can get mine to run almost 100MHz higher than stock, and if my cousin even tries to touch the values his PC hangs!) Talking about that... the only thing I've never tried is overclocking the GPU (I've seen that option in atools). Dunno if it's doable on our Kaisers and whether there's a real increase/decrease in performance... if you give it a try, maybe you could publish your results...
I'll try clocking the CPU to about 500MHz or a bit more.
Going to use it tomorrow, because it's about 9pm at the moment.
GPU would be nice if it were possible to overclock it.
I'll post results tomorrow; it could be that I can't see any difference compared to a lower frequency.
Thanks for your help btw!
I've noticed a small difference when trying Quake 2 just now.
The difference isn't very great, but I will try clocking it a bit higher tomorrow.
Yooooo
As I've said before many times, there is no risk in overclocking past 528MHz, because we are not touching the voltage going to the CPU when we overclock it, just the crystal that controls the frequency it runs at. There is a slight heat increase, and consequently some loss of battery because of that extra heat, but that is it. I am speaking from experience, having overclocked a 2.4GHz Celeron E2200 to 4.25GHz on air cooling with no disastrous results, and I used it like that every day. Basically, I overclocked it with a pin mod for voltage and FSB increases, then overclocked it further from the motherboard and supplied even more voltage. For the heatsink I used an OEM Socket 775 heatsink and had no problems at all; it typically ran about 48°C. If an overclock that high and potentially harmful won't kill the CPU, then certainly one of a much smaller percentage is not going to harm a little Kaiser.
FYI, I took the voltage on that Celeron to 1.7 volts to get it that high. Stock is 1.1 if I recall.
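Plugging the recalled figures (1.1V to 1.7V, 2.4GHz to 4.25GHz) into the usual dynamic-power relation P ∝ V²·f gives a feel for how much extra heat that Celeron overclock produced. This ignores leakage current, so treat it as a lower bound:

```python
# Rough dynamic-power scaling for the Celeron E2200 overclock above,
# using P ~ V^2 * f (leakage ignored, so real heat is higher still).
v_stock, f_stock = 1.1, 2.4    # volts, GHz (as recalled in the post)
v_oc, f_oc = 1.7, 4.25

scale = (v_oc / v_stock) ** 2 * (f_oc / f_stock)
print(f"~{scale:.1f}x stock dynamic power")   # ~4.2x
```

That quadrupling is exactly why the voltage-bumped desktop OC needed real cooling, while a crystal-only overclock on the Kaiser (voltage untouched) only scales with frequency.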
aceoyame said:
As I've said before many times, there is no risk in overclocking past 528MHz, because we are not touching the voltage going to the CPU when we overclock it, just the crystal that controls the frequency it runs at. There is a slight heat increase, and consequently some loss of battery because of that extra heat, but that is it. I am speaking from experience, having overclocked a 2.4GHz Celeron E2200 to 4.25GHz on air cooling with no disastrous results, and I used it like that every day. Basically, I overclocked it with a pin mod for voltage and FSB increases, then overclocked it further from the motherboard and supplied even more voltage. For the heatsink I used an OEM Socket 775 heatsink and had no problems at all; it typically ran about 48°C. If an overclock that high and potentially harmful won't kill the CPU, then certainly one of a much smaller percentage is not going to harm a little Kaiser.
FYI, I took the voltage on that Celeron to 1.7 volts to get it that high. Stock is 1.1 if I recall.
Awesome. I could only reach 2.9GHz on an Intel Q9400 on an ASRock G31M-S (it's a really crappy and cheap mobo!). The Q9400 runs stock @ 2.66GHz. Maybe you can help me with that too lol!! Kiddin'...
Sent from my HTC Kaiser using XDA App
albertorodast2007 said:
Awesome. I could only reach 2.9GHz on an Intel Q9400 on an ASRock G31M-S (it's a really crappy and cheap mobo!). The Q9400 runs stock @ 2.66GHz. Maybe you can help me with that too lol!! Kiddin'...
Sent from my HTC Kaiser using XDA App
See the BSEL pin mod for Socket 775. As it isn't a Gigabyte board, that should send your OC soaring through the roof, since it doesn't use CPUID for configuring its clock speed on boot.
Millence said:
I would not call 32 frames per second in Neocore, with your build, pretty crap.
Well, it can't be compared to new phones, but you must admit that this is more than enough to run 3D games (we can play Raging Thunder 2!).
Just tried Raging Thunder 2. Wow, I had no idea games like this could run well on our old machines... except without an accelerometer I can't see how to steer and accelerate at the same time.
Are there any other nice-looking action/racing games out there that give decent frame rates?

[Q] Dual Core V. Single Core?

So with the new dual-core phones coming out, I'm wondering... what's all the hullabaloo?
I just finished reading the Moto Atrix review from Engadget, and it sounds like crap. They said docking to the ridiculously priced webtop accessory was slow as shiz.
Anyone who knows better, please educate me. I'd like to know what is or will be offered that dual-core will be capable of that our current-gen phones will NOT be capable of.
For one thing (my main interest anyway), dual-core CPUs and beyond give us better battery life. If we end up having more data-intensive apps and Android becomes more powerful, multi-core CPUs will help a lot too. Naturally, Android will need to be broken down and revamped to utilize multiple cores to their full potential, though. At some point I can see Google using more of, or merging a large part of, the desktop Linux kernel to help with that process.
At the rate Android (and smartphones in general) is progressing, someday we may see a 64-bit OS on a phone; we will definitely need multi-core CPUs then. I know, it's a bit of a dream, but it's probably not too elaborate.
KCRic said:
For one thing (my main interest anyway), dual-core CPUs and beyond give us better battery life.
I'd really, REALLY like to know how you came to that particular conclusion. While a dual core might not eat through quite as much wattage as two single cores, one that takes less than a single core is pure snake oil IMO. I have yet to see a dual-core CPU rated lower than a comparable single core on the desktop. Why would this be different for phones?
Software and OSes that can handle a dual-core CPU need additional CPU cycles to manage the threading this results in, so if anything, dual-core CPUs will greatly, GREATLY diminish battery life.
The original poster's question is valid. What the heck would one need dual-core CPUs in phones for? Personally, I can't think of anything. Running several apps in parallel was a piece of cake way before dual CPUs, and more power can easily be obtained by increasing the clock speed.
I'm not saying my parent poster is wrong, but I sure as heck can't imagine the physics behind his statement. So if I'm wrong, someone please enlighten me.
I can see dual cores offering a smoother user experience: one core could be handling an audio stream while the other is doing phone stuff. I don't see how it could improve battery life, though...
The theory is that two cores can accomplish the same thing as a single core while only working half as hard; I've seen several articles stating that dual cores will help battery life. Whether that is true I don't know.
Sent from my T-Mobile G2 using XDA App
Kokuyo, while you do have a point about dual cores being overkill in a phone, I remember long ago people saying "why would you ever need 2GB of RAM in a PC" or "who could ever fill up a 1TB hard drive".
Thing is, wouldn't the apps themselves have to be made to take advantage of dual cores as well?
JBunch1228: the short-term answer is nothing. Same answer as the average Joe asking what he needs a quad-core in his desktop for. Right now it seems as much a sales gimmick as anything else, since the only Android version that can actually make use of it is HC. Kinda like the 4G bandwagon everyone jumped on; all marketing right now.
Personally, I'd like to see what happens with the paradigm the Atrix is bringing out in a year or so. Put Linux on a decent-sized SSD for the laptop component, and use the handset for processing and communications exclusively, rather than treating the 'laptop dock' as nothing more than an external keyboard.
As far as battery life, I can see how dual cores could affect it positively: a dual core doesn't pull as much power as two individual cores, and if the chip runs for half as long as a single core would for the same operation, that could give you better battery life. Everyone keep in mind I said *if*. I don't see that happening before Q4, since the OS and apps need to be optimized for it.
My $.02 before depreciation.
Then there are the rumors of mobile quad-cores from Nvidia by Q4 as well. I'll keep my single-core Vision and see what's out there when my contract ends. We may have a whole new world.
KCRic said:
For one thing (my main interest anyway), dual-core CPUs and beyond give us better battery life. If we end up having more data-intensive apps and Android becomes more powerful, multi-core CPUs will help a lot too. Naturally, Android will need to be broken down and revamped to utilize multiple cores to their full potential, though. At some point I can see Google using more of, or merging a large part of, the desktop Linux kernel to help with that process.
Wow, that's complete nonsense.
You can't add parts and end up using less power.
Also, Android needs no additional work to support multiple cores. Android runs on the LINUX KERNEL, which is ***THE*** choice for multi-core/multi-processor supercomputers. Android applications each run in their own process; the Linux kernel then takes over process scheduling. Android applications are also *already* multi-threaded (unless the specific application developer was a total newb).
At the rate Android (and smartphones in general) is progressing, someday we may see a 64-bit OS on a phone; we will definitely need multi-core CPUs then. I know, it's a bit of a dream, but it's probably not too elaborate.
What's the connection? Just because desktop processor manufacturers went multi-core and 64-bit at roughly the same time doesn't mean the two are even *slightly* related. Use of a 64-bit OS on a phone certainly does ***NOT*** somehow require that the processor be multi-core.
dhkr234 said:
Wow, that's complete nonsense.
You can't add parts and end up using less power.
Also, Android needs no additional work to support multiple cores. Android runs on the LINUX KERNEL, which is ***THE*** choice for multi-core/multi-processor supercomputers. Android applications each run in their own process, and the Linux kernel then takes over process scheduling. Android applications are also *already* multi-threaded (unless the specific application developer was a total newb).
What's the connection? Just because the desktop processor manufacturers went multi-core and 64bit at roughly the same time doesn't mean that the two are even *slightly* related. Use of a 64bit OS on a phone certainly does ***NOT*** somehow require that the processor be multi-core.
Click to expand...
Click to collapse
The connection lies in the fact that this is technology we're talking about. It continually advances and does so at a rapid rate. Nowhere did I say we'll make that jump 'at the same time'. Linux is not ***THE*** choice for multi-core computers; I use Sabayon, but Win7 also seems to do just fine with multiple cores. Android doesn't utilize multi-core processors to their full potential and also uses a modified version of the Linux kernel (which does fully support multi-core systems); that's why I made the statement about merging. Being Linux and being based on Linux are not the same thing. Think of iOS or OSX - based on Linux, but tell me, how often do Linux instructions work for a Mac?
"you can't add parts and use less power", the car industry would like you clarify that, along with the computer industry. 10 years ago how much energy did electronics use? Was the speed and power vs. power consumption ratio better than it is today? No? I'll try to give an example that hopefully explains why consumes less power.
Pizza=data
People=processors
Time=heat and power consumption
1 person takes 20 minutes to eat 1 whole pizza while 4 people take only 5 minutes. That one person is going to have to work harder and longer in order to complete the same task as the 4 people. That will use more energy and generate much more heat. Heat, as we know, causes processors to become less efficient which means more energy is wasted at the higher clock cycles and less information processed per cycle.
It's not a very technical explanation of why a true multi-core system uses less power but it will have to do. Maybe ask NVidia too since they stated the Tegra processors are more power efficient.
KCRic said:
The connection lies in the fact that this is technology we're talking about. It continually advances and does so at a rapid rate. Nowhere did I say we'll make that jump 'at the same time'. Linux is not ***THE*** choice for multi-core computers; I use Sabayon, but Win7 also seems to do just fine with multiple cores.
Click to expand...
Click to collapse
Show me ***ONE*** supercomputer that runs wondoze. I DARE YOU! They don't exist!
Android doesn't utilize multi-core processors to their full potential and also uses a modified version of the Linux kernel (which does fully support multi-core systems); that's why I made the statement about merging. Being Linux and being based on Linux are not the same thing.
Click to expand...
Click to collapse
??? No, being LINUX and GNU/LINUX are not the same. ANDROID ***IS*** LINUX, but not GNU/LINUX. The kernel is the kernel. The modifications? Have nothing to do with ANYTHING this thread touches on. The kernel is FAR too complex for Android to have caused any drastic changes.
Think of iOS or OSX - based on Linux, but tell me, how often do Linux instructions work for a Mac?
Click to expand...
Click to collapse
No. Fruitcakes does NOT use LINUX ***AT ALL***. They use MACH. A *TOTALLY DIFFERENT* kernel.
"you can't add parts and use less power", the car industry would like you clarify that, along with the computer industry. 10 years ago how much energy did electronics use? Was the speed and power vs. power consumption ratio better than it is today? No? I'll try to give an example that hopefully explains why consumes less power.
Click to expand...
Click to collapse
Those changes are NOT RELATED to adding cores, but making transistors SMALLER.
Pizza=data
People=processors
Time=heat and power consumption
1 person takes 20 minutes to eat 1 whole pizza while 4 people take only 5 minutes. That one person is going to have to work harder and longer in order to complete the same task as the 4 people. That will use more energy and generate much more heat. Heat, as we know, causes processors to become less efficient which means more energy is wasted at the higher clock cycles and less information processed per cycle.
It's not a very technical explanation of why a true multi-core system uses less power but it will have to do. Maybe ask NVidia too since they stated the Tegra processors are more power efficient.
Click to expand...
Click to collapse
You have come up with a whole lot of nonsense that has ABSOLUTELY NO relation to multiple cores.
Energy consumption is related to CPU TIME.
You take a program that takes 10 minutes of CPU time to execute on a single-core 3GHz processor, split it between TWO otherwise identical cores operating at the SAME FREQUENCY, add in some overhead to split it between two cores, and you have 6 minutes of wall time on TWO cores - that's 12 core-minutes of CPU time, which is 20% *MORE* energy consumed on the dual-core processor.
And you want to know what NVIDIA will say about their bloatchips? It uses less power than *THEIR* older hardware because it has **SMALLER TRANSISTORS** that require less energy.
Don't quit your day job, computer engineering is NOT YOUR FORTE.
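[Editor's note] The 20% figure in that post is just arithmetic, and it can be sketched in a few lines. The "energy scales with cores × active time at fixed frequency and voltage" simplification is the poster's premise, restated here, not a measured fact:

```python
# Back-of-envelope check of the CPU-time argument above.
# Simplifying assumption: at a fixed frequency and voltage, energy
# scales with (number of active cores) x (time they spend running).

single_core_min = 10          # CPU time on one core, as in the post
dual_core_min = 6             # per-core wall time after split + overhead
cores = 2

single_energy = 1 * single_core_min   # 10 core-minutes
dual_energy = cores * dual_core_min   # 12 core-minutes

extra = (dual_energy - single_energy) / single_energy
print(f"dual-core consumes {extra:.0%} more energy")  # 20% more
```

Under that premise the dual-core run burns 12 core-minutes versus 10, i.e. the quoted 20% extra.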
dhkr234 said:
Show me ***ONE*** supercomputer that runs wondoze. I DARE YOU! They don't exist!
??? No, being LINUX and GNU/LINUX are not the same. ANDROID ***IS*** LINUX, but not GNU/LINUX. The kernel is the kernel. The modifications? Have nothing to do with ANYTHING this thread touches on. The kernel is FAR too complex for Android to have caused any drastic changes.
No. Fruitcakes does NOT use LINUX ***AT ALL***. They use MACH. A *TOTALLY DIFFERENT* kernel.
Those changes are NOT RELATED to adding cores, but making transistors SMALLER.
You have come up with a whole lot of nonsense that has ABSOLUTELY NO relation to multiple cores.
Energy consumption is related to CPU TIME.
You take a program that takes 10 minutes of CPU time to execute on a single-core 3GHz processor, split it between TWO otherwise identical cores operating at the SAME FREQUENCY, add in some overhead to split it between two cores, and you have 6 minutes of wall time on TWO cores - that's 12 core-minutes of CPU time, which is 20% *MORE* energy consumed on the dual-core processor.
And you want to know what NVIDIA will say about their bloatchips? It uses less power than *THEIR* older hardware because it has **SMALLER TRANSISTORS** that require less energy.
Don't quit your day job, computer engineering is NOT YOUR FORTE.
Click to expand...
Click to collapse
If you think that it's just a gimmick or trend, then why does every laptop manufacturer use dual core or more and get better battery life than the old single cores? Sometimes trends do have more use than aesthetic appeal. Your know-it-all approach is nothing new around here, and you're not the only person who works in IT around here. Theories are one thing, but without any proof, when ALL current tech says otherwise... it makes you sound like an idiot. Sorry...
I bet I can pee further
Sent from my HTC Vision using XDA App
zaelia said:
I bet I can pee further
Sent from my HTC Vision using XDA App
Click to expand...
Click to collapse
The smaller ones usually can, I think it has to do with the urethra being more narrow as to allow a tighter, further shooting stream.
Sent from my HTC Glacier using XDA App
TJBunch1228 said:
The smaller ones usually can, I think it has to do with the urethra being more narrow as to allow a tighter, further shooting stream.
Sent from my HTC Glacier using XDA App
Click to expand...
Click to collapse
Well, you would know
sino8r said:
Well, you would know
Click to expand...
Click to collapse
It might be short but it sure is skinny.
Sent from my HTC Glacier using XDA App
sino8r said:
If you think that it's just a gimmick or trend, then why does every laptop manufacturer use dual core or more and get better battery life than the old single cores? Sometimes trends do have more use than aesthetic appeal. Your know-it-all approach is nothing new around here, and you're not the only person who works in IT around here. Theories are one thing, but without any proof, when ALL current tech says otherwise... it makes you sound like an idiot. Sorry...
Click to expand...
Click to collapse
+1
I was comparing speeds on the Atrix compared to the [email protected] and they matched. The Atrix was much more efficient on heat and probably with battery. The dual cores will use less power because the two cores will be better optimized for splitting tasks, each using half the power to run the same process that a single core would run at full load. Let's not start a flame war and make personal attacks on people.
Sent from my HTC Vision with Habanero FAST 1.1.0
It is disturbing that there are people out there who can't understand this VERY BASIC engineering.
Voltage, by itself, has NO MEANING. You are forgetting about CURRENT. POWER = CURRENT x VOLTAGE.
Battery drain is DIRECTLY PROPORTIONAL to POWER. Not voltage. Double the voltage and half the current, power remains the same.
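[Editor's note] That identity is easy to check numerically; the voltage and current values below are arbitrary illustrative units, not specs from any phone:

```python
# P = I x V: doubling voltage while halving current leaves the
# power draw (and therefore battery drain) unchanged.
voltage, current = 3.7, 0.5          # e.g. a nominal Li-ion voltage, 500 mA
power = voltage * current            # watts

power_after = (voltage * 2) * (current / 2)
print(power, power_after)            # identical
```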
Dual core does NOT increase battery life. It increases PERFORMANCE by ***DOUBLING*** the physical processing units.
Battery life is increased through MINIATURIZATION and SIMPLIFICATION, which becomes *EXTREMELY* important as you increase the number of physical processing units.
It is the epitome of IGNORANCE to assume that there is some relation when there is not. The use of multiple cores relates to hard physical limitations of the silicon. You can't run the silicon at 18 GHz! Instead of racing for higher frequencies, the new competition is about how much work you can do with the SAME frequency, and the ***EASIEST*** way to do this is to bolt on more cores!
For argument's sake, take a look at a couple of processors:
Athlon II X2 240e / C3.... 45 watt TDP, 45 nm
Athlon II X4 630 / C3.... 95 watt TDP, 45 nm
Same stepping, same frequency (2.8 GHz), same voltage, same size, and the one with twice the cores eats more than twice the power. Wow, imagine that!
The X4 is, of course, FASTER, but not by double.
Now let's look at another pair of processors:
Athlon 64 X2 3800+ / E6.... 89 watt TDP, 90 nm
Athlon II X2 270u / C3.... 25 watt TDP, 45 nm
Different stepping, SAME frequency (2.0 GHz), same number of cores, different voltage, different SIZE, WAY different power consumption. JUST LOOK how much more power the older chip eats!!! 3.56 times as much. Also note that other power management features exist on the C3 that didn't exist on the E6, so the difference in MINIMUM power consumption is much greater.
Conclusion: There is no correlation between a reduction in power consumption and an increase in the number of PPUs. More PPUs = more performance. Reduction in power consumption is related to size, voltage, and other characteristics.
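[Editor's note] The ratios that post leans on can be checked directly. The TDP figures are copied from the post itself; TDP is of course only a rough proxy for real-world draw:

```python
# TDP (watts) as quoted in the post above.
x2_240e = 45    # Athlon II X2 240e,  45 nm, 2 cores, 2.8 GHz
x4_630 = 95     # Athlon II X4 630,   45 nm, 4 cores, 2.8 GHz
x2_3800 = 89    # Athlon 64 X2 3800+, 90 nm, 2 cores, 2.0 GHz
x2_270u = 25    # Athlon II X2 270u,  45 nm, 2 cores, 2.0 GHz

# Doubling the cores on the same node: more than double the TDP.
print(round(x4_630 / x2_240e, 2))    # 2.11
# Same core count, smaller node: a 3.56x drop in TDP.
print(round(x2_3800 / x2_270u, 2))   # 3.56
```

Which is exactly the post's point: more cores raised consumption, while the die shrink (plus voltage and power-management changes) is what lowered it.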
dhkr234 said:
Don't quit your day job, computer engineering is NOT YOUR FORTE.
Click to expand...
Click to collapse
Good job on being a douche. I didn't insult you in anything I said, and if you disagree with my perspective then state it; otherwise shut up. I didn't tell you English grammar isn't your forte, so maybe you should keep your senile remarks to yourself.
You seem to want to argue over a few technicalities, and I'll admit I don't have a PhD in computer engineering, but then again I doubt you do either. For the average person to begin to understand the inner workings of a computer, you have to set aside the technical details and generalize everything. When they read about a Mac, they will see the word Unix, which also happens to appear in things written about Linux, and they would inevitably make the connection that both are based off the same thing (which they are). In that sense I'm correct - you're wrong. The average person doesn't differentiate between 'is' and 'based off'; most people take them in the same context.
So I may be wrong on some things when you get technical, but when you're talking to the average person who thinks a higher CPU core clock = a better processor, you end up being wrong because they won't give a damn about the FSB or anything else. Also, when you start flaming people and jumping on them over insignificant things, you come off as a complete douche. If I'm wrong on something then tactfully and politely correct me - don't act like an excerebrose know-it-all. Let's not even mention completely going off track about Windoze; servers aren't the only things that have multi-core processors.
I'm sure you'll try to multi-quote me with a slew of unintelligent-looking, lame comebacks and corrections, but in the end you'll just prove my point about the type of person you are. ****The End****
KCRic said:
Good job on being a douche. I didn't insult you in anything I said, and if you disagree with my perspective then state it; otherwise shut up. I didn't tell you English grammar isn't your forte, so maybe you should keep your senile remarks to yourself.
Click to expand...
Click to collapse
Agreeing or disagreeing is pointless when discussing FACTS. Perspective has nothing to do with FACTS. You can think whatever you like, but it doesn't make you right.
You seem to want to argue over a few technicalities and I'll admit, I don't have a PhD in computer engineering but then again I doubt you do either.
Click to expand...
Click to collapse
Common mistake, assuming that everybody is the same as you. Try not to make that assumption again.
For the average person to begin to understand the inner-workings of a computer requires you to set aside the technical details and generalize everything.
Click to expand...
Click to collapse
Generalizations lead to inaccuracies. You do not teach by generalizing, you teach by starting from the bottom and building a foundation of knowledge. Rene Descartes (aka Renatus Cartesius, as in Cartesian geometric system, as in the father of analytical geometry) said that the foundation of all knowledge is that doubting one's own existence is itself proof that there is someone to doubt it -- "Cogito Ergo Sum" -- "I think therefore I am". Everything must begin with this.
When they read about a Mac, they will see the word Unix which also happens to appear in things written about Linux and would inevitably make a connection about both being based off of the same thing (which they are). In that sense, I'm correct - you're wrong. The average person doesn't differentiate between 'is' and 'based off', most people take them in the same context.
Click to expand...
Click to collapse
... and need to be CORRECTED for it. The two kernels (the only components relevant to this discussion) are completely different! MACH is a MICRO kernel, Linux is a MONOLITHIC kernel. Superficial characteristics (which are OUTSIDE of the kernel) be damned, they are NOT the same thing and thinking that they are is invalid. The average person is irrelevant, FACTS are FACTS.
So I may be wrong in some things when you get technical but when you're talking to the average person that thinks the higher the CPU core clock is = the better the processor, you end up being wrong because they won't give a damn about the FSB or anything else.
Click to expand...
Click to collapse
So are you trying to tell me that IGNORANCE is BLISS? Because "giving a damn" or not has NO BEARING on reality. The sky is blue. You think that it's purple and don't give a damn; does that make it purple? No, it does not.
Also, when you start flaming people and jumping on them over insignificant things, you come off as a complete douche. If I'm wrong on something then tactfully and politely correct me - don't act like an excerebrose know-it-all. Let's not even mention completely going off track about Windoze; servers aren't the only things that have multi-core processors.
Click to expand...
Click to collapse
Right, servers AREN'T the only thing running multi-core processors, but did you not read where I SPECIFICALLY said **SERVERS**? Wondoze is off track and UNRELATED. I brought up servers because THEY USE THE SAME KERNEL AS ANDROID. If a supercomputer uses Linux, do you not agree that Linux is CLEARLY capable of multiprocessing well enough to meet the needs of a simple phone?
I'm sure you'll try to multi-quote me with a slew of unintelligent-looking, lame comebacks and corrections, but in the end you'll just prove my point about the type of person you are. ****The End****
Click to expand...
Click to collapse
... perfectionist, intelligent, PATIENT in dealing with ignorance. And understand that ignorance is not an insult when it is true, and contrary to common "belief", does NOT mean stupid. Learn the facts and you will cease to be ignorant of them.
So hopefully this train can be put back on the tracks...
From what I'm understanding from more technically minded individuals, dual core should help with battery life because it requires less power to run the same things as a single core. It can then probably be extrapolated that, when pushed, dual core will be able to go well above and beyond its single-core brethren in terms of processing power.
For now, it appears the only obvious benefits will be increased battery life and less strain on the processor from overworking. Hopefully in the near future more CPU- and GPU-intensive processes are introduced to the market which will fully utilize dual core's potential in the smartphone world. Thanks for all the insight.
dhkr234 - *slaps air high-five*

3 Reasons stopping me from buying a Galaxy Nexus Please help me change my mind :D

Hey everyone
As seen in this review: http://www.theverge.com/2011/11/17/2568348/galaxy-nexus-review (not sure if it is the final unit), the Galaxy Nexus is great in terms of both hardware and software, but there are three reasons still persisting as to why I shouldn't buy the Galaxy Nexus:
1.) The hardware could become obsolete fairly quickly.
Although (as the review states) the phone is blazing fast, the hardware is only considered great in relation to other phones. (E.g. the Nexus One had the best hardware compared to the G1, but when the Atrix was announced it became fairly obsolete.) This could be a problem: right now the phone might have excellent hardware (and software), but in a few months when CES 2012 comes along, it is rumoured there will be quad-core phones that greatly surpass the speed of current-day phones (e.g. http://mobilesyrup.com/2011/11/15/r...ndroid-4-0-with-a-2-5ghz-quad-core-processor/). I know that nothing can substitute for pure vanilla Android, the most recent updates from Google and a huge developer base, but with technological advancements becoming more and more prominent and new apps and games arriving within a year or so, I feel that one year from now the Galaxy Nexus might be like the G1 of today. (If anyone has any contradictory reasons, please state them, as I really want to purchase a Galaxy Nexus and get rid of my Motorola Milestone, the international version of the OG Droid.)
2.) The battery might not suffice for a full day's use.
The only way I can consider my Motorola Milestone a viable quality smartphone is if I overclock it to 1GHz (from 550MHz) and apply various tweaks, which in turn only let me use the device for 5-7 hours max. If this is the case with the Galaxy Nexus, I probably won't want to buy it, as I use my phone extensively and I don't want the hassle of charging every night (or at least every 5-7 hours).
3.) This one is not a huge issue for me, but it may be in the future. With the gaming market actively expanding into smartphones, and with the way-outdated GPU that this phone packs, I fear I may not be able to play a lot of games in the future.
most 'reviewers' are probably not charging the battery when they receive the phone, instead opting to insert the battery right away after opening the box and starting up the phone without charging.
this leads to inconclusive reviews regarding the battery life.
just a thought.
oscillik said:
most 'reviewers' are probably not charging the battery when they receive the phone, instead opting to insert the battery right away after opening the box and starting up the phone without charging.
this leads to inconclusive reviews regarding the battery life.
just a thought.
Click to expand...
Click to collapse
You're right. As stated in the review, they haven't tested the battery extensively, and they said they would update the review with the new battery findings in the future
1) What do you mean, could get obsolete?
It is already obsolete, except for the 720p HD screen LOL
2) You might be right about that; we'll see how many hours this 1750 mAh battery can pull
3) You also forgot to mention the lack of space for the ever-increasing storage required to play games
mohitrocks said:
Hey everyone
As seen on this review: http://www.theverge.com/2011/11/17/2568348/galaxy-nexus-review (not sure if it is the final unit) the galaxy nexus is great in terms of both hardware and software but there are three reasons still persisting as to why I shouldn't buy the Galaxy Nexus
1.) The hardware could become obsolete fairly quickly.
Although (as the review states) the phone is blazing fast, the hardware is only considered great in relation to other phones. (E.g. the Nexus One had the best hardware compared to the G1, but when the Atrix was announced it became fairly obsolete.) This could be a problem: right now the phone might have excellent hardware (and software), but in a few months when CES 2012 comes along, it is rumoured there will be quad-core phones that greatly surpass the speed of current-day phones (e.g. http://mobilesyrup.com/2011/11/15/r...ndroid-4-0-with-a-2-5ghz-quad-core-processor/). I know that nothing can substitute for pure vanilla Android, the most recent updates from Google and a huge developer base, but with technological advancements becoming more and more prominent and new apps and games arriving within a year or so, I feel that one year from now the Galaxy Nexus might be like the G1 of today. (If anyone has any contradictory reasons, please state them, as I really want to purchase a Galaxy Nexus and get rid of my Motorola Milestone, the international version of the OG Droid.)
2.) The battery might not suffice for a full day's use.
The only way I can consider my Motorola Milestone a viable quality smartphone is if I overclock it to 1GHz (from 550MHz) and apply various tweaks, which in turn only let me use the device for 5-7 hours max. If this is the case with the Galaxy Nexus, I probably won't want to buy it, as I use my phone extensively and I don't want the hassle of charging every night (or at least every 5-7 hours).
3.) This one is not a huge issue for me, but it may be in the future. With the gaming market actively expanding into smartphones, and with the way-outdated GPU that this phone packs, I fear I may not be able to play a lot of games in the future.
Click to expand...
Click to collapse
Every phone will be outdated in a year... If that's a big deciding factor, you may never get a phone.
Sent From Samsung Vibrant
I have to confirm what they said about the battery life. I charged the phone up fully before even turning it on and with heavy use (you know how it is the first day you get a new phone) I got around 7 hours out of it. With normal usage I can imagine it lasting a full work day but if you are a heavy user you probably want to look into an external or expanded battery.
As for gaming, from my tests some games really don't run very well, although it might be partly due to them not being optimised for the Nexus hardware or ICS.
To be honest though with any modern smartphones 6-8 hours is pretty much what you can expect with heavy use. I doubt you will find anything much better. If gaming is important I suggest you hold off getting a Galaxy Nexus though. Right now it's not looking too good.
Sent from my Galaxy Nexus using XDA App
Chrono_Tata said:
To be honest though with any modern smartphones 6-8 hours is pretty much what you can expect with heavy use. I doubt you will find anything much better. If gaming is important I suggest you hold off getting a Galaxy Nexus though. Right now it's not looking too good.
Sent from my Galaxy Nexus using XDA App
Click to expand...
Click to collapse
Probably it just needs some optimization,
but compared to the SGS2 T989 it sure is lacking a bit there.
The T989 on stock can easily pull 14 hr ~ 18 hr with moderate use, and if you are a light user, you can get over a day with that phone.
see here http://forum.xda-developers.com/showthread.php?t=1301609
Issues 1 and 2 apply to pretty much any smartphone you get nowadays. So STi489's statement is quite accurate.
I'll refrain from commenting on #3 because I don't do mobile gaming, so don't really know/care a lot about modern phone GPUs.
1. Every phone is obsolete 3-6 months after it's made, as technology is always advancing. It's similar to buying a PC: if you need to buy one, you get the best you can afford at that time. If you want something that's going to be future-proof, you will never get anything, as it's just not possible. You can hold out for a quad-core phone if you want, but they won't start appearing for 3-6 months at least, and then if you get one of those, an 8-core phone may be 9 months away from that, so what are you going to do? The Galaxy Nexus can handle everything that will be thrown at it right now, and it will always get the latest updates as soon as they are released by Google, which I think is more important than the speed of the phone.
2. As mentioned many times, the battery tests are not thorough enough to be of any value, but with any modern smartphone you aren't going to get much more than 8-10 hours of intensive use, which should be more than enough to get you through a day without problems. If you think you are going to get a phone with an HD screen and a dual- or quad-core processor to last 2-3 days, you are dreaming. Leave it in standby most of the time and use it sparingly, then yes, but with moderate use a day is quite sufficient. You can buy 4800mAh backup chargers for about £20, which is what I'm getting instead of another battery; it should be able to recharge your phone 2 or 3 times.
3. I'm pretty sure there won't be many games in the next year that require a quad-core phone, as they wouldn't have much of a target market. Dual-core phones will easily be able to handle any game thrown at them for the foreseeable future, and this one can easily be clocked to at LEAST 1.5GHz, which is what it's designed to run at, so if you need extra speed you can get it. The iPhone 4S only runs at 800MHz, and look what that can do.
Mark.
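[Editor's note] The "recharge your phone 2 or 3 times" estimate for a 4800mAh pack is easy to sanity-check. The ~80% conversion efficiency below is an assumption for illustration, not a quoted spec:

```python
# Rough recharge count for a USB battery pack topping up a phone.
pack_mah = 4800        # backup charger capacity, as quoted
phone_mah = 1750       # Galaxy Nexus stock battery
efficiency = 0.80      # assumed USB/voltage-conversion losses

recharges = pack_mah * efficiency / phone_mah
print(round(recharges, 1))   # roughly 2 full charges
```

With less pessimistic losses the figure creeps toward the quoted "2 or 3 times".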
mskip said:
1. Every phone is obsolete 3-6 months after it's made, as technology is always advancing. It's similar to buying a PC: if you need to buy one, you get the best you can afford at that time. If you want something that's going to be future-proof, you will never get anything, as it's just not possible. You can hold out for a quad-core phone if you want, but they won't start appearing for 3-6 months at least, and then if you get one of those, an 8-core phone may be 9 months away from that, so what are you going to do? The Galaxy Nexus can handle everything that will be thrown at it right now, and it will always get the latest updates as soon as they are released by Google, which I think is more important than the speed of the phone.
2. As mentioned many times, the battery tests are not thorough enough to be of any value, but with any modern smartphone you aren't going to get much more than 8-10 hours of intensive use, which should be more than enough to get you through a day without problems. If you think you are going to get a phone with an HD screen and a dual- or quad-core processor to last 2-3 days, you are dreaming. Leave it in standby most of the time and use it sparingly, then yes, but with moderate use a day is quite sufficient. You can buy 4800mAh backup chargers for about £20, which is what I'm getting instead of another battery; it should be able to recharge your phone 2 or 3 times.
3. I'm pretty sure there won't be many games in the next year that require a quad-core phone, as they wouldn't have much of a target market. Dual-core phones will easily be able to handle any game thrown at them for the foreseeable future, and this one can easily be clocked to at LEAST 1.5GHz, which is what it's designed to run at, so if you need extra speed you can get it. The iPhone 4S only runs at 800MHz, and look what that can do.
Mark.
Click to expand...
Click to collapse
Mostly I agree with you, but on point 3 you mixed something up. The iPhone 4S has a very capable (if not the best released) dual-core GPU, the PowerVR SGX 543MP2, while the GN is using an overclocked PowerVR SGX 540. It is still more than capable, but you're mixing up CPU with GPU in your post. Just wanted to clarify that.
Cheers
qwer23
qwer23 said:
Mostly I agree with you, but on point 3 you mixed something up. The iPhone 4S has a very capable (if not the best released) dual-core GPU, the PowerVR SGX 543MP2, while the GN is using an overclocked PowerVR SGX 540. It is still more than capable, but you're mixing up CPU with GPU in your post. Just wanted to clarify that.
Cheers
qwer23
Click to expand...
Click to collapse
Point taken, I was watching TV at the time while I was typing and wasn't thinking too much lol.
Mark.
The GNex comes with a 1750mAh battery.
I suppose it would last me 2-3 days (maybe I'm wrong) and I'm a light user.
My friend's SGS2 lasts him 2-3 days with light to moderate usage, BUT it's not always connected.
I hope the GNex's battery life is better than my iP4's, so it would really be an upgrade for me, as I am also not always connected to WiFi and stay on 2G only.
I just hope Samsung will release an official extended battery (3000mAh-ish) with a kickstand (like those for the HD2). It would be great.
I'm just concerned about how the GPU will cope in playing games. Otherwise, I can't wait until Sat/Sun for T-Mobile to get this in stock.
soullinker20 said:
The GNex comes with a 1750mAh battery.
I suppose it would last me 2-3 days (maybe I'm wrong) and I'm a light user.
My friend's SGS2 lasts him 2-3 days with light to moderate usage, BUT it's not always connected.
I hope the GNex's battery life is better than my iP4's, so it would really be an upgrade for me, as I am also not always connected to WiFi and stay on 2G only.
Click to expand...
Click to collapse
With WiFi off and using only 2G, I think the battery life would be very good. Constant updates to social networking sites can be a real battery killer on today's phones.
luffyp said:
I just hope Samsung will release an official extended battery (3000mAh-ish) with a kickstand (like those for the HD2). It would be great.
Click to expand...
Click to collapse
It would indeed be good, but it's extremely unlikely.
Mark.
mskip said:
With WiFi off and using only 2G, I think the battery life would be very good. Constant updates to social networking sites can be a real battery killer on today's phones.
Mark.
Click to expand...
Click to collapse
thank you!
This would be a worthy upgrade from my iP4 imo. I'm prepared to switch to Android now,
but I still have to wait 2-3 weeks before this phone arrives here in the Philippines.
Thank you all so much for the replies!
I do believe I have changed my mind
1.) Yeah, even though the phone will be obsolete in 3-6 months, all phones become obsolete soon after their initial release.
2.) I don't mind buying another external battery for my phone; I guess I never thought about it. (Mark, what do you mean by: "You can buy 4800mAh backup chargers for about £20 which is what I'm getting instead of another battery and should be able to recharge your phone 2 or 3 times." Is this a charger that can supply power on the go? Because I go to high school [yeah, I'm 15 years old] and having an extra battery or charger works for me.)
3.) I barely game anyway, and there are plenty of games that can still run on this GPU.
soullinker20 said:
Thank you!
This would be a worthy upgrade from my iP4 imo. I'm prepared to switch to Android now,
but I still have to wait 2-3 weeks before this phone arrives here in the Philippines.
Well, at least that will give you time to read reviews from people who are using the phone and see how it's performing in the real world before you order it.
Mark.
mohitrocks said:
Thank you all so much for the replies!
I do believe I have changed my mind.
1.) Yeah, even though the phone will be obsolete in 3-6 months, all phones are likely to become obsolete soon after their initial release.
2.) I don't mind buying another external battery for my phone; guess I never thought about it. (Mark, what do you mean by: "You can buy 4800mAh backup chargers for about £20 which is what im getting instead of another battery and should be able to recharge your phone 2 or 3 times." Is this a charger that can supply battery power on the go? Because I go to high school [yeah, I'm 15 years old] and having an extra battery or charger works for me.)
3.) I barely game anyway, and there are plenty of games that can still run on the GPU.
Something like *THIS*. It's basically a huge-capacity rechargeable battery that you can plug any USB device into to charge it. In my opinion it's more practical than buying replacement batteries, as you don't have to turn the phone off to keep going. It could even be used as a rechargeable LED torch lol.
I'm sure you could find one that can be delivered to your location with a Google search.
Mark.
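As a rough sketch of the backup-charger math, here's a back-of-envelope estimate of how many times such a pack could refill the phone. The 70% round-trip efficiency is my own assumption (losses from the pack's cells boosting to 5V USB, plus charging losses), not a measured figure; with a more optimistic efficiency you get closer to the "2 or 3 times" quoted in the thread:

```python
# Rough estimate of how many full recharges a USB backup charger
# can give a phone battery. Efficiency is an assumed round-trip
# figure (pack cell -> 5V USB -> phone charging); real packs vary.

def estimated_recharges(pack_mah: float, battery_mah: float,
                        efficiency: float = 0.7) -> float:
    """Return the approximate number of full phone charges."""
    return pack_mah * efficiency / battery_mah

# A 4800mAh pack vs the Galaxy Nexus's 1750mAh battery:
charges = estimated_recharges(4800, 1750)
print(f"~{charges:.1f} full charges")  # prints ~1.9 full charges
```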
1. It only needs optimization, and Ice Cream Sandwich is the start of the long-awaited hardware optimization that Android needs.
2. User dependent, and the Galaxy Nexus has a feature that no iPhone has: a replaceable battery. So you could bring an extra battery for emergencies. But I really do think that 1750mAh can last you a day.
3. The GPU is still very capable of handling games. Again, just like no. 1, all it needs is optimization. Besides, I don't see game developers focusing on making games that can only run on a very, very powerful GPU.
