pixel 3 gps accuracy / chipset - Google Pixel 3 XL Questions & Answers

Hey all,
Does anyone know if the Pixel 3 will have that new GPS chip that is supposed to be accurate to within 30 cm or so? I read somewhere a few months back that we can expect high-end smartphones to get this new chipset. Sorry, I don't know what it is called, but I think it was a Broadcom chip? Or perhaps Qualcomm's new GPS chips are also that accurate!

registeredxdadevi said:
Hey all,
Does anyone know if the Pixel 3 will have that new GPS chip that is supposed to be accurate to within 30 cm or so? I read somewhere a few months back that we can expect high-end smartphones to get this new chipset. Sorry, I don't know what it is called, but I think it was a Broadcom chip? Or perhaps Qualcomm's new GPS chips are also that accurate!
I hope not; I rely on GPS drift when playing Pokémon GO. I can be sitting at the local pub enjoying a pint while my trainer is across the street at a Gym (where I can battle or join a raid).

My Pixel 3 gets a GPS lock very quickly and stays at 3 m accuracy, so although it isn't getting sub-metre accuracy, it is super quick. I also got a good GPS lock while flying in a commercial plane at 11,000 m and 670 km/h; the photos I took on the plane were all geotagged.


Garmin XT

I don't know if you guys know already or if it's been talked about, but Garmin XT works perfectly on the Touch HD! I pick up a GPS signal as soon as I start the program, even in my house.
I have read around that Garmin Mobile XT has a problem with battery drain.
What can you tell us about that?
All GPS sat-nav programs will drain the battery very quickly. I use both Garmin and iGO and, when I'm not using my power cord, my experience is that they zap the battery at the same rate.
cortez.i said:
All GPS sat-nav programs will drain the battery very quickly. I use both Garmin and iGO and, when I'm not using my power cord, my experience is that they zap the battery at the same rate.
Which one do you prefer and why?
I have not used the GPS long enough to see the battery drain, but I wouldn't doubt it.
As for iGO and Garmin, I like Garmin a lot better, but that's just my opinion.
I like Garmin as well. Just trying to find a more attractive voice to listen to.
Guys, would you say there is considerable lag when using Garmin XT?
What I am experiencing: while driving and then taking a right turn, Garmin doesn't update it until 2 seconds later, thus throwing me off track and then on track again.
If you have tracking (footprints) active and recording, you can see this clearly as sharp 90-degree snaps, as well as strange snaps that don't follow any logic.
Changing the 3D view to the 2D view (north always up) helps to minimize this.
So is this telling me that the 528 MHz CPU is not enough?!
Fadik said:
Guys, would you say there is considerable lag when using Garmin XT?
What I am experiencing: while driving and then taking a right turn, Garmin doesn't update it until 2 seconds later, thus throwing me off track and then on track again.
If you have tracking (footprints) active and recording, you can see this clearly as sharp 90-degree snaps, as well as strange snaps that don't follow any logic.
Changing the 3D view to the 2D view (north always up) helps to minimize this.
So is this telling me that the 528 MHz CPU is not enough?!
I just took my HD and Garmin MXT on a trip to Quebec for New Year's. I can report that there is a graphical "lag", but the navigation prompts are right on the money. I did not use the track feature, so I cannot comment on the way it lays down its "bread crumbs", but I did notice a huge difference in recalculation time compared to my Touch (likely the 201 MHz vs. 528 MHz processors). I have never used any software but Garmin, so I can't compare, but I like it a lot. Oh, I also remember there was no delay with my Touch using the same SD card and a Bluetooth GPS receiver.
Yes, there is sometimes a lag; I noticed this too. If you also track your route, the battery goes down fast. I can track on a full battery for about 6-7 hours.
I've been using the HD and Garmin XT.
I like it a lot.
I compared it to the Nuvi 200 and the Nuvi 265WT. They were all at about the same point at any moment, but none were exactly the same. I do notice that the HD is laggy in the graphics department compared to the Nuvis.
Anyone find a way to improve it?
The Garmin XT doesn't say street names, does it?
Or does it?
tyronne1126 said:
I've been using the HD and Garmin XT.
The Garmin XT doesn't say street names, does it?
Or does it?
No, I haven't found anything either.
I see street names on mine... Mine also doesn't lag at all.
Yes, you can set Garmin so that street names are shown on the screen, but there's no option to hear the street names while navigating.
I don't know if other programs have this option, but it must be nearly impossible, because you can't have voice files for every street in Europe...
I think it is a processor limitation. The nüvis have a TTS (text-to-speech) system, so the device does not need recordings of all the street names but can speak anything; it likely has a dedicated processor for this. The HD has to do it all with one processor, and since it already seems to struggle with just navigation, I think we'll have to wait for the next generation of phones to have enough processing power.
Personally I don't really like the spoken street names. I want English prompts, so most of them are mispronounced anyway; it's more of an entertainment option to laugh at when it tries to pronounce things like Onze-Lieve-Heersbeestjeslaan.
Turb0wned said:
I see street names on mine... Mine also doesn't lag at all.
Do you have version 5.00.20w? I only have lag when zoomed in a lot; when zoomed out it is hardly noticeable. Also, I can only get the upcoming street names to appear at the top of the screen when "not navigating" by doing a roundabout. That is to say, it does not display the upcoming street names by itself the way version 4.00... did.
I have version 4.10.00wp.
Garmin XT or TomTom
Hello all. I tried Garmin XT but did not like it. It's the same as the Zumo that's on my Kawasaki Z1000, and after riding around Spain and back through France I realised how bad it is compared to TomTom Navigator. After using TomTom for a couple of years on my P3300 (Orbit), I got hold of TomTom Navigator 7, which works a treat. As we all know, it's personal preference.
I have the newest one, with the newest maps.
I know the Garmin Nuvi 265WT says street names; I guess they didn't build that into the mobile one because of memory or speed.
Mine always lags: the car is never next to the intersection or in a turn. Maybe I'll try to zoom it out a bit; it's on automatic.
Added this software to the wiki page:
http://wiki.xda-developers.com/index.php?pagename=HTC_Blackstone_GPS
Maybe somebody could update that page with the details of this thread.

HTC HD7 (HD3)

Found this on GSMArena: in their table of phones the HTC HD3 is named HTC HD7. Maybe a Windows Phone 7 device?
http://blog.gsmarena.com/htc-hd7-to-launch-in-october-it’s-a-renamed-hd3/
found this on pocketnow
http://pocketnow.com/windows-phone/htc-hd3-hd7-rumored-again-features-amazing-spec-sheet
Nice, looks great. I also hope the Desire HD has a 1.5 GHz Snapdragon inside.
http://forum.xda-developers.com/showthread.php?t=771580
Almost too good to be true... I'd love to have a WXGA (1280x720) screen resolution!
Do you guys really believe in a dual-core 1.5 GHz ARM CPU?
ne0cr0n said:
found this on pocketnow
http://pocketnow.com/windows-phone/htc-hd3-hd7-rumored-again-features-amazing-spec-sheet
Fake.
-=n3rd=- said:
Do you guys really think of dual-core arm 1,5 ghz cpu?
Old news: http://www.phonearena.com/htmls/Sna...bile-CPU-war-beyond-1GHz-article-a_12546.html
So, where is it? It can't be that nothing has leaked if the release is so soon.
I spotted this a few days ago as well. I am not sure what to make of it, as the info is rather scarce and, tbh, anyone can make a spreadsheet like that and say it is an official "leaked" document from HTC...
I remain skeptical on this issue. If anyone can confirm this to be true, I will personally write an article about it for the front page.
4.5 inches? I'm not too sure I would go for something that big.
3.8", yes. But don't people find 4.3" and up to be a little too big? It's like a mini-tablet.
Also, I was reading that article on the CPUs. Will you need a graphics chip in your phone to play high-quality 3D games, or will a 1.2 GHz dual core be able to handle that?
My plan expires this winter, so I am looking almost frantically for the perfect phone with dual cores and awesome performance and all that (best time for my plan to expire).
4.3 is fine. Not sure about "and up". 3.5 is too small.
theomni said:
4.5 inches? I'm not too sure I would go for something that big.
3.8", yes. But don't people find 4.3" and up to be a little too big? It's like a mini-tablet.
Also, I was reading that article on the CPUs. Will you need a graphics chip in your phone to play high-quality 3D games, or will a 1.2 GHz dual core be able to handle that?
My plan expires this winter, so I am looking almost frantically for the perfect phone with dual cores and awesome performance and all that (best time for my plan to expire).
The bigger the better; I think I can go as high as 5". I've actually been looking at tablets myself. Right now I'm trying it all on for size: I have a travel laptop, an Alienware laptop (not what I would call airplane-portable, but I can take my desktop with me if I want to, and I do want to), an iPad, and my phone.
Ultimately, I'd like it to be just the phone and the mongo-laptop, so a bigger screen would be better. There are just some things/situations that lend themselves better to each of the four devices as they are right now.
As for the GPU: it's already in the 1.2 GHz dual-core chips (they are a system-on-chip, SoC). And yes, I think it's needed for the best 3D rendering, but I'm not an expert.
I'd like to have it... only if...
Hmm, I spotted this on Google when looking for Windows Phone 7 info.
The spec is definitely too good to be true.
One thing is that *no* GPS was mentioned in any of the posts.
But if I have to choose, I'd take GPS over HDMI.
Also, it seems Windows Phone 7 still does not include multi-language input.
That means if I buy this phone, I'll have to search Google again for Japanese + Korean + Chinese input, not to mention that I'll very likely have to hard-reset the phone a million times.
I believe that 1800 mAh is not enough; the HD3 will need charging twice daily.
Can't believe people here are falling for this BS:
http://nak-design.over-blog.fr/pages/HTC_HD3_by_NAK_-2727167.html
efjay said:
Can't believe people here are falling for this BS:
http://nak-design.over-blog.fr/pages/HTC_HD3_by_NAK_-2727167.html
Only some. That has been floating around since, what, February this year? Something like that.
ne0cr0n said:
found this on pocketnow
http://pocketnow.com/windows-phone/htc-hd3-hd7-rumored-again-features-amazing-spec-sheet
A French blogger made that more than 6 months ago...
Worse still: news yesterday was that even though Qualcomm is shipping the 1.5 GHz dual-core chips later this year, they won't make it into devices until mid to late next year. So you can pretty much bet that the HD7 will not be 1.5 GHz if it is released this year.
eknutson said:
Worse still: news yesterday was that even though Qualcomm is shipping the 1.5 GHz dual-core chips later this year, they won't make it into devices until mid to late next year. So you can pretty much bet that the HD7 will not be 1.5 GHz if it is released this year.
For those of us with HD2s, there is little motivation to move to WP7 then. The 1.5 GHz units will probably sport a better version of WP7 anyway.

FYI, the T-Mobile Galaxy S II T989 supports GLONASS for improved GPS

According to Anandtech:
Until recently basically all satellite based location has used GPS, which consists of a constellation of 27 satellites. GLONASS is its Russian cousin, with 24 satellites. The two can be used in conjunction to get faster 3D fixes, better coverage, and greater accuracy. We'll see more and more GNSS solutions start shipping in 2012 as well.
...
If you fire up GPS Test, satellites numbered 65-88 are GLONASS. Qualcomm’s GNSS only uses GLONASS when there either aren’t enough visible GPS satellites for a fix, or SNR is bad (basically an urban canyon or indoors scenario), so you’ll often see a set of bars pop in or out depending on how good the normal GPS SNR is. This implementation is the same for all of the Qualcomm SoCs and MDMs that include GNSS support.
I just tried out GPS Test on my phone, and it picked up the GLONASS satellites as well!
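The numbering convention described in the quote can be sketched as a tiny helper. This assumes the common NMEA-style ID ranges that apps like GPS Test typically display (1-32 GPS, 33-64 SBAS, 65-96 GLONASS); individual apps may renumber, so treat the exact ranges as an assumption, not a spec:

```python
def gnss_system(sat_id: int) -> str:
    """Classify a satellite ID as shown by NMEA-style apps such as GPS Test.

    Assumed convention: 1-32 GPS, 33-64 SBAS (WAAS/EGNOS), 65-96 GLONASS.
    """
    if 1 <= sat_id <= 32:
        return "GPS"
    if 33 <= sat_id <= 64:
        return "SBAS"
    if 65 <= sat_id <= 96:
        return "GLONASS"
    return "unknown"
```

Under this convention the "satellites numbered 65-88" from the Anandtech quote all fall in the GLONASS band.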
Additional confirmation: http://androidandme.com/2011/12/news/qualcomm-enables-dual-core-location-on-snapdragon-phones/
Just one more way our phone is the best on the market.
Yea, it was announced a while ago: http://forum.xda-developers.com/showthread.php?t=1393727
It's pretty cool!

[Q] Adding WAAS feature to GPS

I was looking through the BOM for the Galaxy Nexus, and apparently it uses the SiRFstarIV device for GPS.
That device (like any modern GPS chipset) supports WAAS/EGNOS, which offers a nice increase in precision. My Garmin handhelds will eventually tune into satellites 46, 48 or 51 which supply the data used for the WAAS correction. They're in a geostationary orbit over the western hemisphere, so they are always present in the sky.
But I never see those birds on any Android device, and no GPS app I have seen has advertised WAAS. Why not? The hardware supports the feature!
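For what it's worth, the satellite numbers 46, 48 and 51 look like NMEA-style IDs for SBAS birds. If I have the usual convention right (SBAS PRNs 120-158 are displayed as ID = PRN - 87; an assumption worth verifying against your receiver's docs), they map back to PRNs like this:

```python
def sbas_prn(nmea_id: int) -> int:
    """Map an NMEA-style satellite ID in the SBAS range (33-64) back to
    its SBAS PRN, assuming the common ID = PRN - 87 convention."""
    if not 33 <= nmea_id <= 64:
        raise ValueError(f"{nmea_id} is not in the SBAS ID range 33-64")
    return nmea_id + 87

# The Garmin's WAAS "birds":
print([sbas_prn(i) for i in (46, 48, 51)])  # [133, 135, 138]
```

PRNs 133, 135 and 138 are, as far as I know, WAAS GEO slots, which would fit the "always present in the sky" behaviour described above.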
I was hoping to answer this question myself by looking at the source code; but I have no experience with large software projects (or Git) and I haven't been able to find a "driver" yet by browsing source repositories. Would anyone be willing to "hold my hand" a little to guide me towards the relevant sources?

Development [GS101|GS201] Google Tensor G1 and G2 In-Depth

Hello everyone,
This thread will be used as a hub where I share some discoveries/observations I stumbled upon, mostly while working on my kernel projects.
I'll clone the same thread over to the Pixel 7 forum as well. So, without much further ado, let's dive right in.
A year ago everyone was excited about the Google SoC called Tensor 1, a.k.a. GS101. One year later there is Tensor 2, a.k.a. GS201.
I suggest reading up on the differences (updated modem, ISP, TPU and GPU) in the various tech articles.
Here's a table so everyone gets up to speed on cores used, max freqs and other details: (attached image)
But how does that translate to the devices?
There were quite a few rumors before the actual release that the Tensor 2 SoC would be manufactured on Samsung's 4nm node instead of 5nm. However, that was just wild speculation and unfortunately turned out to be inaccurate: Tensor G2 is still manufactured on the 5nm process, as confirmed by Google. This was quite a negative surprise to me, as I don't have good experiences with the SD888, which is also manufactured on Samsung's 5nm node and is quite a hot chip. While the switch to Samsung's 4nm node wasn't all that great either (check SD8 Gen 1 on Samsung 4nm vs. SD8+ Gen 1 on TSMC 4nm), it would still have been an improvement.
While I was very excited about the Tensor G1 when the Pixel 6 devices launched, that excitement ebbed away the more I worked on the Pixel 6 series. The more I learned about the source, the more I stumbled upon Exynos driver after Exynos driver: some are left exactly as on an Exynos device, some were "re-branded" by Google. Some Google did customize, but most of the drivers are very much Exynos.
So, all in all, the following excerpt from Andrei Frumusanu's article here sums it up pretty fittingly:
While Google doesn’t appear to want to talk about it, the chip very clearly has provenance as a collaboration between Google and Samsung, and has a large amount of its roots in Samsung Exynos SoC architectures.
The same is true for the Tensor 2, despite minor upgrades there. As I learned over time, Tensor shares a lot of Exynos characteristics; one of those is performance vs. thermals, as hinted at in the linked Anandtech article. So let's jump into the first topic I want to cover.
I will cover more topics, and they will probably be interconnected, but we have to start somewhere.
Thermals, Thermal Ceiling, Exynos Roots and Maximal Performance
To start things off: "thermals" is a term that actually sums up a few mechanisms. Let's split this into two main areas.
The thermal ceiling (let's call it that) implemented in the kernel: the maximum temperature the SoC is allowed to operate at.
The thermal-hal, which uses combined sensors, including virtual sensors, and restricts different subsystems based on the temperatures of those sensors. These can be called skin temps, shell temps, battery temperature, modem temperature, etc.
First, let's explore the thermal ceiling on the two SoCs:
GS101 on Pixel 6 devices is allowed to operate at 90°C. GS201 on Pixel 7 devices raised the thermal ceiling by 10°C to 100°C.
Whether changes to the internal design allowed them to raise this without further increasing the heat-up of the device, or whether they just tuned the thermal-hal to better keep it in check, I don't know at this moment.
Let's get back to the Exynos characteristics. I talked to a few other developers I met along the way who have Exynos "experience". Exynos SoCs reach the thermal ceiling extremely quickly, as I learned. This means the SoC can't keep its max CPU freqs for more than a few seconds without touching the thermal ceiling and getting restricted. This is, in a way, also the case for other SoCs, but Exynos is very extreme in this regard. It's just the characteristic of the SoC, as previously mentioned.
That means, in turn: the thermal ceiling sets the maximum allowed performance, to a great extent. If the thermal ceiling is raised, maximum performance can be held longer.
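On a rooted device you can inspect these limits through the kernel's thermal sysfs interface. Here is a minimal read-only sketch in Python, assuming the standard Linux `/sys/class/thermal` layout; the zone names and which trip point encodes the 90/100°C ceiling differ per device, so the paths are illustrative, not a Tensor-specific recipe:

```python
import glob
import os

def read_thermal_zones(base="/sys/class/thermal"):
    """Return {zone_name: current_temp_in_C} for all kernel thermal zones.

    The kernel reports temperatures in millidegrees Celsius; zones that
    can't be read (permissions, offline sensors) are simply skipped.
    """
    zones = {}
    for zone in sorted(glob.glob(os.path.join(base, "thermal_zone*"))):
        try:
            with open(os.path.join(zone, "type")) as f:
                name = f.read().strip()
            with open(os.path.join(zone, "temp")) as f:
                zones[name] = int(f.read().strip()) / 1000.0  # mC -> C
        except (OSError, ValueError):
            continue
    return zones
```

The trip points live next to `temp` as `trip_point_*_temp` files in the same millidegree format; adjusting those (where the kernel allows it) is essentially what moving the ceiling from 90°C to 100°C amounts to.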
Here's a demonstration of this:
Pixel 6 Pro in its default configuration running at 90°C temp ceiling:
New video by freak 07
photos.app.goo.gl
Pixel 6 Pro with temperature ceiling raised to 100°C, instead of 90°C running at Pixel 7 Pro clocks
New video by freak 07
photos.app.goo.gl
Pixel 7 Pro, with the default configuration of 100°C
New video by freak 07
photos.app.goo.gl
Now, what's interesting: the big cores get hot the quickest. Once the ceiling is reached, performance drops, as the max CPU freqs get restricted.
That's the case after a few seconds on both SoCs, in typical Exynos fashion.
Let´s make the next connection:
Although I'm not necessarily a friend of benchmark apps, you might ask yourself how this changes the results of a CPU-oriented benchmark like Geekbench. There are other benchmarks, but I want to keep this simple for now.
The answer is: the Pixel 6 Pro with GS101 gets pretty close to the results of the Pixel 7 Pro with GS201.
So, for comparison's sake:
On the left: Pixel 6 Pro in its default configuration, running at the 90°C temp ceiling.
On the right: Pixel 6 Pro with the temperature ceiling raised from 90°C to 100°C, running at Pixel 7 Pro clocks.
The kernel used was the same, with no changes to anything that could impact performance.
The left screenshot above was taken from XDA's Pixel 7 Pro review, while the right one was taken on my Pixel 7 Pro running my kernel.
Please don't start benchmark contests now; it's just for comparison's sake.
It makes sense for single-core to be less impacted, as single-core benchmarks don't put as much thermal pressure on the SoC: they don't touch the thermal ceiling as much, so no cutbacks are applied.
Geekbench applies a series of short benchmarks to the device, usually no longer than 3-8 seconds each, which is ideal for a SoC like the Tensor: short bursts at max performance, so it can run "nearly" without touching the thermal ceiling.
If a benchmark is structured differently, like a sustained CPU stress test, you will see QCOM SoCs holding their max freqs for minutes instead of seconds.
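The burst-vs-sustained effect can be illustrated with a toy model (all numbers below are made up for illustration, not measured): a SoC that holds full performance for only a few seconds before hitting its ceiling still averages near its peak in a short Geekbench-style subtest, but falls far below it in a minutes-long stress run.

```python
def avg_perf(duration_s, secs_to_ceiling, max_perf=100.0, throttled_perf=60.0):
    """Average performance over a workload for a toy SoC that runs at
    max_perf until it hits the thermal ceiling after secs_to_ceiling,
    then drops to throttled_perf for the rest of the run."""
    hot_s = max(0.0, duration_s - secs_to_ceiling)   # time spent throttled
    cool_s = duration_s - hot_s                      # time at full clocks
    return (cool_s * max_perf + hot_s * throttled_perf) / duration_s

# "Tensor-like" chip that hits the ceiling after ~4 s:
print(avg_perf(5, 4))     # short Geekbench-style subtest: 92.0
print(avg_perf(60, 4))    # one-minute stress run: ~62.7
# "Snapdragon-like" chip that holds max freqs for minutes:
print(avg_perf(60, 120))  # 100.0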
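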
Well that´s the first part. More to come. I hope everyone enjoyed this little writeup so far.
I wish everyone a nice evening.
Thermal Ceiling/Maximal Performance - A comparison between QCOM Snapdragon and Tensor
Now you might ask yourself how QCOM's Snapdragon behaves in the little test we conducted above.
You can find the answer below.
For this, a Zenfone 9 with the Snapdragon 8+ Gen 1 is used.
Pixel 6 Pro in its default configuration running at 90°C temp ceiling:
New video by freak 07
photos.app.goo.gl
Pixel 7 Pro, with the default configuration of 100°C:
New video by freak 07
photos.app.goo.gl
Asus Zenfone 9, with the default configuration of 110°C temp ceiling:
New video by freak 07
photos.app.goo.gl
As you can see, the Zenfone 9 with SD8+ Gen 1 can hold its max freqs for minutes. It doesn't touch the thermal ceiling when running under max load for a minute, while Tensor immediately scratches the ceiling.
I'm not a SoC expert, and I think only engineers with insider knowledge know the exact reason why Exynos-based SoCs behave this way; they just seem to work totally differently in that regard.
Another point: since the SoCs are different, we can't compare the temperatures one to one. There's no way for me to know the exact placement of the temperature sensors. All I can say for sure is that the SD8+ Gen 1 does not touch the thermal ceiling in this test, and there seems to be a lot of headroom after one minute of maxed-out CPU.
In the end the result will be the same: after a while the device will heat up and the thermal-hal will throttle the ZF9 back as well, since with only passive cooling that's inevitable.
Interesting, thanks for this explanation and comparison. Learned something new today.
So Tensor is just Exynos, rebranded and with more AI performance?
w_tapper said:
So Tensor is just Exynos, rebranded and with more AI performance?
I'm not sure it's that easy in the end. It's not simply rebranded; there's a lot of custom stuff in it, but at the core it's based on Exynos, that's clear.
I guess it's just not possible to come up with a truly custom chip in only a few years. This is probably the beginning of that.
A few thoughts on Tensor that already allow Google to better tune software to hardware, with real-life benefits:
The 4+2+2 design. That's unique to Tensor and greatly favors the short performance bursts that are really critical in everyday usage. Everyone is constantly opening apps; if those apps launch a fraction faster, that's a good real-world benefit. (Typical app launch times are between 0.2 s and 0.7 s.)
In that regard Tensor is not lagging behind the competition, say the SD8+ Gen 1, the way benchmark scores alone would suggest.
I guess that's a good subject for another in-depth post investigating real-world usage.
The other thing is the TPU and the ISP.
Last year I used a Sony Xperia 1 III with SD888 (Sony more or less uses very relaxed thermals and basically unleashes the full performance of the SD888) vs. the Pixel 6 Pro to edit and cut the same video file.
The Pixel 6 Pro finished the task quite a bit faster, if I recall around 10-15 seconds faster.
There are other examples, but this will do for now.
Updated the second post with a comparison between QCOM's Snapdragon and Google's Tensor.
Interesting, since I wonder if a switch from P6P to P7P is worth it.
Utini said:
Interesting since I wonder if a switch from P6P to P7P is worth it.
I like my P7P even better than the P6P, although I had no particular issues with the P6P. The USB 3.2 Gen 2 (versus Gen 1) USB-C port pushed me over the edge.
Utini said:
Interesting since I wonder if a switch from P6P to P7P is worth it.
In the end you have to decide for yourself.
I can maybe share a few experiences. I think there's not a single real regression going from the P6 Pro to the P7 Pro for me. That's an important point, considering that's not always the case in such upgrade scenarios.
I like the slightly lower placement of the buttons; they're easier to reach for me. The FP reader might be a tad faster, but I was one of the lucky ones who never really had trouble on the P6 Pro. I like the face unlock.
What I really enjoy is the macro mode in GCam. It works quite well and fills a gap.
While I never had any real problems with mobile network on the P6 Pro, there are three areas I regularly drive through where all phones struggle in some way. Some lose signal there, among them the P6 Pro; others can barely keep it. Though you never know with latency and "cheating" with the signal icons; each OEM handles that differently.
Usually I notice I can't call anyone there, and Spotify playback gets interrupted due to signal loss on the P6 Pro. With the P7 Pro I'm able to make and receive calls in those areas, with no other kind of signal loss either. Whether that's the case for everyone or just an edge case for me, I don't know; it's just an observation.
There are lots of small improvements that add up. Whether it's worth it for you, you have to decide.
Freak07 said:
In the end you have to decide yourself. [...]
Yeah, I have read about the new "features". I loved face unlock on the 4 XL, and all the other new changes on the P7P are very welcome as well. But those aren't enough for me at the moment to switch to a new device, especially since I first want thermals (e.g. 4K 60 fps video recording, or using the phone in the summer), radio signal and battery to be improved.
Radio signal seems to be better for now. My problems with the P6P seem to be the same ones that you have/had.
Maybe that also helps slightly with battery, especially in dual-SIM mode.
But I guess ultimately I will wait for the next Pixel and hope that it is going to be a "bigger" jump. The P6P is really fine for a phone. Jesus, I would even be fine with the P4 XL.
Nice research and information. Was very interested in taking a deeper dive into the core differences between the G1 and G2. Thanks for sharing and laying this all out in a way that makes sense. Interested to see what else you can uncover!
Wait... how did you change the temperature-control threshold? I have been working on this. After the surface temperature of the phone exceeds 39°C, the operating frequency of Tensor drops sharply, making the game experience very bad... Is there something like a Magisk module to raise the temperature-control threshold? (Translated via Google.)
