[GUIDE] How-To: 4K GameStream without 4K computer monitor(s) - Shield Android TV General

So as you guys know, the Shield TV is a solid 4K device, supporting 4K@60Hz over HDMI 2.0.
nVidia GameStream works by playing the game on your PC and using the GPU to rapidly encode the video to an H.264 stream for the Shield, while redirecting all of the input on the Shield back to the PC. This, for the most part, works very well. Up to this point it has generally been limited to 1080p@60Hz. The new beta of GeForce Experience has enabled both 4K@30Hz and 4K@60Hz streaming to the Shield. Additionally, the new nVidia Network Streaming service no longer kills the game if the monitor is powered off or detached from the PC (this is crucial for this to work).
This generally works out of the box if you already have a 4K monitor connected to your computer. However, most of us probably don't, so here's a relatively easy way to get true 4K gaming in your living room.
Prerequisites:
GeForce Experience Beta Release which you can get here: http://www.geforce.com/geforce-experience/gfe-beta
nVidia GTX 970 or better (or SLI equivalent; in my opinion, the minimum for playable performance.)
Ethernet-connected Shield or a really, really good MIMO 802.11ac bridge (which is what I use.)
If you already have a 4K monitor attached to your computer and you're already able to play games at 2160p, then you're done: just go to your Shield, set your game resolution, and play and enjoy.
However, if you instead have a field of 1080p monitors and no 4K to speak of, here's what you need to do.
Open nVidia Control Panel
Disable nVidia Dynamic Super Resolution
Go to the "Change Resolution" screen and select your monitor.
Click the "Custom" button, then create a custom resolution.
The custom resolution should match that of your TV, regardless of what monitor(s) you have attached. In most cases this will be 3840x2160@60Hz. If you have an earlier 4K TV model that only supports 30Hz on its HDMI port, change the refresh rate to 30Hz. This needs to exactly match the refresh rate the Shield is running at on your TV for the best quality and smoothness of the video stream.
Click Test; your monitor will go black and you won't be able to see anything. This is key: hit the right arrow on your keyboard TWICE, then hit Enter. This tells the app "yeah, that resolution looked OK," even if you couldn't see a thing.
Repeat this step for EVERY monitor in your setup, ensuring that all the settings are exactly the same. This is crucial, or your game will NOT be able to see the custom resolution when it's time to turn up the pixel count!
Once you've created the custom resolution for your monitors, restart the nVidia Network Streaming service. Then, if you like, or if you're afraid of breaking your monitors by sending out-of-range resolutions to them (which, unless they're CRTs, you shouldn't be), you can turn your monitor(s) off. This is the part that matters: nVidia has made it so the card can render headless, which is huge for this, potentially even opening up rendering completely unbeknownst to a user physically at the workstation. But that's beside the point.
Head over to your Shield, fire up a game, go into the settings, and you should see the 4K resolution you configured earlier. Fire it up and you're good to go. Expect to see around 21 Mbit/s of bandwidth used.
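For a sense of why the jump to 4K is demanding on the encoder and the network, here's a back-of-the-envelope pixel-throughput comparison (illustrative numbers only; the ~21 Mbit/s figure above is the observed stream bitrate, not derived from this):

```python
# Rough pixel-throughput comparison between 1080p60 and 4K60 streams.
# Back-of-the-envelope numbers, not NVIDIA's actual encoder math.

def pixels_per_second(width, height, fps):
    """Raw pixels the encoder must process each second."""
    return width * height * fps

p1080 = pixels_per_second(1920, 1080, 60)   # 1080p60
p2160 = pixels_per_second(3840, 2160, 60)   # 4K60

# 4K is exactly four times the pixel count of 1080p at the same refresh
# rate, which is why the encoder load and bitrate climb so sharply.
print(p2160 // p1080)  # -> 4
```

So the GTX 970 recommendation above isn't arbitrary: the card has to render the game and encode four times the pixels of a 1080p stream in real time.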
Hit me up in this thread if you have issues.

4k gamestream issues
Hi there, I am having issues getting my Nvidia Shield TV to stream games at 4K. I just tried these steps on my ultrawide (3440x1440), but when I click Test it says the test failed and my display does not support the custom resolution. Is there any other way to force 4K onto the Shield TV?

MForce22 said:
Hi there, I am having issues getting my Nvidia Shield TV to stream games at 4K. I just tried these steps on my ultrawide (3440x1440), but when I click Test it says the test failed and my display does not support the custom resolution. Is there any other way to force 4K onto the Shield TV?
Same with me. Using the 2017 Shield TV with latest update.

What if I just enable DSR and choose 4k in the game?

DSR works as well, but the aspect ratio is important (it should be 16:9).
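To check whether a given mode is actually 16:9, you can reduce the resolution to its simplest ratio. A quick sketch (the ultrawide example mirrors the 3440x1440 reports earlier in the thread):

```python
from math import gcd

def aspect_ratio(width, height):
    """Reduce a resolution to its simplest aspect ratio, e.g. (16, 9)."""
    d = gcd(width, height)
    return (width // d, height // d)

# 3840x2160, the Shield's 4K target, is 16:9:
print(aspect_ratio(3840, 2160))  # -> (16, 9)

# A 3440x1440 ultrawide reduces to 43:18 (roughly 21:9), which is why
# forcing a 16:9 custom resolution onto it can fail the test.
print(aspect_ratio(3440, 1440))  # -> (43, 18)
```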

You no longer need to add a custom resolution or enable DSR; with the 2017 upgrade it will let you select 4K even if your monitor does not support it or is turned off.
You *might* have to enable experimental features in the GeForce Experience settings, though; I'm not too sure about that. To activate it, make sure you set the resolution to 2160p in the Quality settings on the Nvidia Shield for the PC.

Can someone with a non 4K monitor let me know if GameStream is still working fine?
A few weeks ago I had everything working properly. 4K would stream fine without having to change anything on my monitor. But for some reason it has stopped working. I can't figure it out and it's doing my head in!
The only thing I can think of that has changed recently was the Windows 10 Creators update. But surely that isn't it because I can't seem to find anyone else having issues.
I even tried reinstalling Windows and Geforce Experience from scratch. Still have issues!
When I try loading a game in GameStream, it doesn't change from my monitor's resolution (1920x1200).
I've tried turning game optimization on and off, and I've set the resolution in the GameStream settings to 2160p (I even tried all the other resolutions).
I've reset the Shield to factory settings; I've even tried a different Shield. Nothing is working.
Any ideas what is causing this?
Video card is a GTX 1070 (with latest drivers)
Monitor is a Dell 2407wfp-hc (1920x1200)
TV is a Samsung 55KS8000
Shield is the 2017 model (though I have also tried the 2015 model as well)

Mine does not work out of the box
PC = 34" Ultrawide 3440x1440 w/ 980TI
4k = Samsung UN75MU8000F w/ 2017 shield
4k = Vizio p65-c1 w/ 2015 shield
I am messing with Custom Resolution Utility from https://www.monitortests.com now to try to get it to work

Hello !
A few years later... I've got an Nvidia Shield 2019 and I will try these things to play in 4K on my TV while my PC's monitor is 2K.
BUT... I read that, a few years ago, Nvidia provided the ability to shut down the PC's monitor without dropping the GameStream connection, right?
Nowadays, if I turn off my PC's monitor, it stops the GameStream connection... what am I doing wrong, please?

Related

[Q] Kindle HD 8.9 problem with HDMI output resolution

I have a simple question and to be honest, I am a little surprised that googling the issue resulted in nearly no hits. So I am trying my luck now here in the forum:
Part of the reason to buy the Kindle Fire HD 8.9 was for me to be able to use it as a playing device for movies on my HDTV. I also bought the official amazon HDMI to Micro HDMI cable to connect the kindle with my TV. The problem I am now having is that my HDTV (Philips 52PFL9632D) produces two vertical black bars to the left and the right of the picture, i.e. it doesn't offer the full 1920x1080p resolution, but something a little smaller (I guess around 1700x1080p).
Yes, I tried all the scaling options on my TV to no avail. And my Xbox, my BluRay-Player and my Dell Notebook do produce a flawless Full HD 1920x1080p experience on that TV Set on the exact same HDMI port. So it must be a problem of the HDMI output of my Kindle.
I suspect that the reason for that behaviour is the fact that the Kindle 8.9 display features a resolution of 1920x1200, which is not 16:9 but 16:10.
So as a result my TV probably squeezes the picture to fit it and thus produces the black bars left and right. It also doesn't matter if I am playing a video using VLC, surfing the web with Silk or just display the Kindle App selection screen on the kindle. There are always the black bars left and right.
(BTW: Amazon customer service acknowledged the problem, but couldn't find an immediate resolution and promised to call me back.)
Does anybody have any idea what I can do to get the full resolution on my HDTV? I was so looking forward to not having to unplug my notebook every time and hook it up to the TV, but being able to use my Kindle for watching movies instead.
What I am really surprised about:
1. Why does Amazon produce a default signal output on the HDMI port larger than 1920x1080 anyway? Most people would probably want to hook it up to their TV, and except for some monitors I don't know of any TV featuring a resolution of 1920x1200.
2. Why does the Kindle not offer any configuration option to at least choose the HDMI output?
3. Why does the net basically produce no results when I look for a solution to this issue? Am I the only one hooking up his Kindle to his HDTV? Or am I the only one having this problem? Or am I the only one annoyed by seeing two vertical bars on the TV?
ANY help how I can get rid of the vertical bars would be greatly appreciated!
I am going to throw out a guess at why it's not full 1080; someone can correct me if I am wrong, because like I said this is just a guess. From what I heard, the Kindle Fire HD 8.9" model dedicates a set amount of RAM to the GPU to render at the resolution it uses, and from what I read it was just barely enough memory for that resolution, so anything higher would probably not be possible. Let me see if I can find where I read this.
Edit: found it!
ignoramous said:
The wallpaper service public APIs were selectively disabled to alleviate the memory pressure on system_server. Amazon's default Kindle Launcher doesn't need wallpapers (yeah I hear you when you say apps with FLAG_SHOW_WALLPAPER however do need wallpapers... but that functionality was apparently deemed not *that* important).
We freed about 14MiB+ worth RAM from system_server. This change was primarily targeted at the 8.9 Kindles, as 1GiB RAM was barely holding up to Kindle's massive resolution.
I see that having a bool in framework had work cut out for you guys... so I guess props for us to have made it "configurable" eh?
Sent from my Amazon Kindle Fire HD running CM10.1 Tablet UI using xda-developers app
Interesting theory. But I read elsewhere that Amazon claims the output of the HDMI socket of the Kindle Fire HD 8.9 is FULL HD.
I think that the output is in fact 1920x1200 (equaling the native resolution of the Kindle display) and thus even MORE than Full HD, which is only 1920x1080. My TV is trying to squeeze the full 16:10 information into the 16:9 screen and as a result has to display two vertical black bars left and right. So I am looking for a way to actually REDUCE the output of the HDMI socket to the lower (HD standard) 1080 instead of 1200 pixels.
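The "around 1700x1080" guess from earlier in the thread can be checked with a quick fit-inside calculation. A small sketch, assuming the TV preserves aspect ratio rather than stretching:

```python
def fit_inside(src_w, src_h, dst_w, dst_h):
    """Scale a source picture to fit inside a destination panel while
    preserving aspect ratio; returns the displayed size plus the
    black-bar thickness per side (horizontal, vertical)."""
    scale = min(dst_w / src_w, dst_h / src_h)
    out_w, out_h = round(src_w * scale), round(src_h * scale)
    return out_w, out_h, (dst_w - out_w) // 2, (dst_h - out_h) // 2

# A 1920x1200 (16:10) signal fitted into a 1920x1080 (16:9) panel:
w, h, side_bar, top_bar = fit_inside(1920, 1200, 1920, 1080)
print(w, h, side_bar, top_bar)  # -> 1728 1080 96 0
```

So the displayed picture would be 1728x1080 with 96-pixel bars on each side, which matches the roughly-1700-wide picture observed on the Philips set.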
Well, I seriously have doubts about this working on Amazon's stock OS, but supposedly the full version has experimental support for HDMI resolution changing. Give it a shot; if it doesn't work on Amazon's OS, I'm fairly convinced it will on CM. https://play.google.com/store/apps/details?id=com.nexter.miniscalerfree
Sent from my Amazon Kindle Fire HD running CM10.1 Tablet UI using xda-developers app
Same Issues
I actually have had very similar problems using my Sony television. I'm curious if you ever found a solution.
stunts513 said:
Well I seriously have doubts about this working on amazon's stock os, but supposedly the full version has experimental support for HDMI resolution changing. Give it a shot, though if it doesn't work on Amazon's os I'm more so convinced it will on cm. https://play.google.com/store/apps/details?id=com.nexter.miniscalerfree
Sent from my Amazon Kindle Fire HD running CM10.1 Tablet UI using xda-developers app
My Kindle is rooted, so maybe it works. Thank you very much for the great tip.
I asked the question in the ROM-AOSP CM10.2 forum, since they even have a line under features saying
"HDMI video out is now correct and work across several resolutions. YMMV"
but so far nobody in that forum cared to answer...
I will try out your tip asap and report the results here.
Still I am wondering why I only found so few hits for this issue including in the amazon.com kindle discussion forum.
If nobody uses their kindle to connect it to their TV, why is amazon selling their own branded version of MicroHDMI-HDMI cables?
Ok. Time for an update. I think I am a step closer to a solution.
First I tried the app "Resolution Changer Lite" as recommended by stunts513. In theory it looked great: I was able to select a resolution of 1920x1080, and as a result the Kindle had a black bar at the bottom and only used 1920x1080 of the 1920x1200 pixels on its display.
So, full of expectation, I connected the Kindle to my TV. Unfortunately the TV still displayed black bars to the right and left, but now also black bars top and bottom. So while it indeed reduced the displayed area, the Kindle somehow seems to still send the 16:10 information through the HDMI port, making my TV scale down the picture. I tried all kinds of settings and resolutions to no avail.
As a next step I bought the full version of "Resolution Changer", because it mentioned an additional HDMI feature. Again full of expectation, I started the app. Instead of selecting one of several preset resolutions (like 1920x1080 or 1280x720), the app lets you enter any values for width and height. I again tried 1920x1080, with the exact same result as in the lite version.
But the full app has a second tab called "HDMI", and I did the same thing there. Unfortunately, again to no avail. After some testing and reading of the description, I came to the verified conclusion that all this additional feature does is DETECT when somebody connects an HDMI cable to the port of the Android device and then AUTOMATICALLY change the screen resolution to the chosen values. So instead of having to invoke the app manually every time, you in theory just set the wanted values on the second tab, and any time you plug in your TV, your Android device switches to the defined resolution.
Unfortunately there is no value to edit that stops the Kindle from sending out 16:10 information to the TV; or maybe the app doesn't change anything that is sent to the HDMI port at all, but only influences the built-in screen.
But then, unexpectedly, Amazon's customer service sent me a message! They basically confirmed that if you connect your Kindle to a TV through HDMI, it sends out its 16:10 resolution and thus makes your TV scale down the picture.
BUT when you switch to the NATIVE INTERNAL VIDEO APP, the signal sent to the HDMI port automatically switches to 1920x1080 with a 16:9 ratio. So as a result, the native Amazon services like "instant video" and other streams will be displayed without any bars. They claimed to have even tested it on an LG Full HD plasma.
THIS EXPLAINS WHY MOST PEOPLE DON'T HAVE ANY PROBLEMS, and thus why you can't find much about it in the forums.
I have to say I was sceptical, but since I had only used VLC so far to play back the movies, I now used the original video player. And to my surprise the black bars left and right went away and I had a full screen picture!!!
That's why I am saying, I am one step further!
Unfortunately this only helps me partly. For starters, VLC plays a lot more formats than the native video player app. In addition, I like to watch movies with subtitles, since I am not a native English speaker. VLC plays them automatically if the .srt file is named the same as the movie itself, but of course the native video player doesn't; I don't even know if it is capable of that.
Anyhow the big questions I have now:
1. does somebody know how to make the regular android Video Player display subtitles for example in the format .srt?
2. is there an option in the Android version of VLC (I don't think so, since it is still BETA) to configure the HDMI output?
3. Now that we know, that the regular Video Player can influence the format sent to the HDMI port, is there an app to set the HDMI output permanently to 1920x1080 and more important 16:9 ?
Again any help appreciated...
Ah, I wish I had known you were using VLC; I would have suggested a few things. It didn't occur to me you might be using it, because I got mine from the Play Store, but it said it wasn't available in the USA, so I had to proxy through another country to get it. Anyway, I know in VLC's options there are a few different output methods, though I don't think that will affect it. Another thing I was thinking of: in VLC you can change the aspect ratio. I believe the button is on the bottom left above the playback bar while playing a video; when you tap it, it cycles through various modes. I don't know if this will affect the HDMI output at all, though, as I don't have a mini-HDMI cable; most of my TVs are CRTs, only two of them have HDMI plugs, and of those two only one is 1080-capable, and it's a PC monitor so it has no sound output.
Sent from my Amazon Kindle Fire HD running CM10.1 Tablet UI using xda-developers app
Hashcode's CM10.1 and HDMI 1920x1080
Hi Tronar,
Saw your post (http://forum.xda-developers.com/showpost.php?p=45044803&postcount=579) but I'm not allowed to answer in that thread.
I have Hashcode's CM10.1 (20130812) running on my Kindle Fire HD 8.9. I connected the Micro HDMI port to my Samsung Full HD TV, and also saw the 2 vertical black bars (about 2 cm). Using apps like Resolution Changer doesn't help, as I get some weird screen flipping, turning landscape into portrait.
Cheers, S
stunts513 said:
Ah, I wish I had known you were using vlc I would have suggested a few things. It didn't occur to me you might be using it because I got my from the play store, but it said it wasn't available in the USA, so I had to proxy through another country to get it. Anyways, I know in vlc's options there's a few different output methods, though I don't think that will affect it. Another thing I was thinking of is in vlc, u can change the aspect ratio, I believe the button is on the bottom left above the playback bar while playing a video, when u tap it it switches they various modes, don't know if this will affect the HDMI at all though, as I don't have a minihdmi cable because most of my TV's are CRT, only 2 of them have HDMI plugs, and of those 2 only one has 1080 capabilities, and its a PC monitor so it has no sound output.
Sent from my Amazon Kindle Fire HD running CM10.1 Tablet UI using xda-developers app
Well, plenty of stuff to respond to.
First of all VLC was available in the regular Kindle Store. So I didn't even have to root my phone and install the Google Playstore to be able to install it.
In addition VLC is still beta. The configuration options are quite minimal compared to the PC version.
And of course I tried the various screen size and ratio options in VLC. But they only affect the way the Kindle display of 1920x1200 is used, i.e. how the available pixels are filled with the video. So you can achieve all kinds of effects using only part of the Kindle display, but whatever option I chose, the black bars left and right always remained on my TV. To put it simply, all I could do was add more black bars by shrinking the image, or distort the image by changing the ratio. None of it seems to have any influence on the HDMI output. I think it is a very special option or feature the regular Kindle video player is using there, and it would have to be implemented explicitly in the VLC player.
Nonetheless this gives me the idea to send some feedback to the VLC developers to add this feature to their wish list. :good:
Anyhow, I really appreciate your help. Please keep the good ideas flowing.
smeitmeister said:
Hi Tronar,
Saw your post (http://forum.xda-developers.com/showpost.php?p=45044803&postcount=579) but I'm not allowed to answer in that thread.
I have Hashcode's CM10.1 (20130812) running on my Kindle Fire HD 8.9. I connected the Micro HDMI port to my Samsung Full HD TV, and also saw the 2 vertical black bars (about 2 cm). Using apps like Resolution Changer doesn't help, as I get some weird screen flipping, turning landscape into portrait.
Cheers, S
Well, it seems like nobody is allowed to answer in that thread, because so far I haven't received any answer.
Neither in the CM10.2 nor in the CM10.1 thread, where there seems to be at least some kind of life.
Thanks for confirming that the Resolution Changer doesn't help on the CM10.1 ROM. That saves me hours of work of trying out that app on this ROM.
But my question was actually a little more specific. The first page of the thread states under Features:
"HDMI video out is now correct and work across several resolutions. YMMV"
I was wondering if that means that there is some way in this CustomROM to change the resolution of the HDMI video out signal.
Update:
I tried the app "MX Player" to see if it would behave more like VLC or more like the native video player.
I don't know if I should laugh or cry, but this is what happened:
I started a movie using the freshly installed MX Player and was happy to see that the subtitles were showing on the Kindle display. I connected my Kindle with the Micro-HDMI cable to my TV and was also happy to see that the vertical black bars were NOT showing; instead it was using the full resolution of my TV. I was totally excited. And then it hit me:
While the subtitles were showing on the Kindle screen, they were NOT showing on my TV!!! Otherwise both screens showed an identical picture, but the subs were missing.
So I tried out all kinds of options in the subtitles settings. To no avail.
Then I tried switching from hardware acceleration to software acceleration. The app offers three modes called SW, HW and HW+.
And here comes the funny part:
When I switched from either of the two HW modes to the SW mode, suddenly my TV was showing the subtitles. And for a moment I was like "Yeah, I finally found the solution." And then it hit me: now my TV was showing the vertical bars again.
My conclusion from this test:
When you switch to software acceleration, MX Player behaves like VLC. You get the subtitles on your TV, but you have to live with the vertical black bars effectively reducing the resolution.
When you switch to any of the two hardware acceleration options, MX Player behaves like the native Video Player and gives you the correct 16:9 1920x1080p output on the HDMI-out. But for whatever reason the subtitles are only displayed on the Kindle screen and not on the TV any more.
My guess: by using HW acceleration, the app automatically triggers the built-in feature of reducing the resolution from 1920x1200 to the standard 1920x1080 on the HDMI out. Apparently Amazon has built that feature into the hardware acceleration of the Kindle.
I don't know why this kills the subtitles. Either it is a bug in the app or for some reason the Hardware acceleration does not allow the blend-in of the subtitles over the video.
And when you switch to SW acceleration, the Kindle apparently thinks of the app like any other standard app and sends the full 1920x1200 resolution not only to its own screen but also to the HDMI-out.
It would be great to know whether superimposing subtitles is technically impossible when using HW acceleration because the Kindle chips do not support it at all, or whether it could be achieved with the right algorithm.
I am unsure offhand if it is the exact same thing, but VLC has an option to enable hardware decoding; if it is the same thing, try it and see if VLC works. Also, I don't know much about HDMI, but if it's a digital stream and subtitles aren't showing up, try enabling captions on your TV and see if anything happens; I would think they would be part of the digital stream in one way or another.
Sent from my Amazon Kindle Fire HD running CM10.1 Tablet UI using xda-developers app
Thank you for the tip about the Hardware Decoding option in VLC, but unfortunately it doesn't seem to make any difference. The picture remains shrunk with vertical bars left and right, whether you enable that setting or not. The subtitles also remain visible in both settings.
So my guess is that the current VLC beta isn't really enabling hardware decoding on the Kindle, at least not the way MX Player and the native Kindle video player seem to do it.
About the subtitles still being part of the digital stream in the HDMI output: sounds like a great idea as well.
So I tried it out by activating the subtitle setting on my TV. Unfortunately, to no avail. It MIGHT be that the way US TVs receive and display subtitles from TV broadcasts is different from the one used in Europe (here subtitles are sent through a quite old system called Teletext or Videotext, depending on the country).
Or it might be that the HDMI protocol just does not support this way of transferring subtitles to the TV.
In any case, unfortunately I am not any closer to solving the issue.
But thanks for the good ideas and tips, please keep them coming as I will keep attacking that problem.

Is there a way to output 1080i?

I have an older TV that doesn't support 1080p but will do 1080i, and I found out last night the unit won't output 1080i. Is there any way I can force this thing to output that? I don't have a rootable unit, but I would consider doing the solder method if I need to. Also, is there a way I can set 1080i in Kodi/XBMC? I have also seen converter boxes that look like they will do 1080p to 1080i; has anyone used one of these?
Thanks
lpstudio said:
I have an older tv that doesn't support 1080P but will do 1080i but found out last night the unit won't output 1080i. Is there any way I can force this thing to output that? I don't have a rootable unit but would consider doing the solder method if I need to. Also is there a way i can set 1080i in kodi/xbmc? Also I have seen converter boxes that look like they will do 1080p to 1080i has anyone used one of these?
Thanks
I would set it to 720p as there's not really that much difference, although it should set itself when you first turn on the AFTV?
You can then set Kodi to 720p by going into System - Settings and choosing 1280x720p in the video output display, although hopefully that will be set automatically as well...
wonneil said:
I would set it to 720p as there's not really that much difference, although it should set itself when you first turn on the AFTV?
You can then set Kodi to 720p by going into System - Settings and choosing 1280x720p in the video output display, although hopefully that will be set automatically as well...
In Kodi it looks like it is locked at 1080. I have heard the Fire TV runs everything at that and downscales everything to what your TV can handle.
lpstudio said:
In Kodi it looks like it is locked at 1080. I have heard the fire Tv runs everything at that and down scales everything to what your tv can handle.
I've never actually tried 720p but according to the AFTV technical details it outputs in both 720p and 1080p...
http://www.amazon.co.uk/Amazon-CL1130-Fire-TV/dp/B00KQEJBSW#tech
Not sure about the Kodi/AFTV combi though.
Same exact problem, unable to sync at 1080i
I have the same exact problem: my TV set is capable of 1080i only, and at 1080p it won't sync!
I can argue against those suggesting 720p (as I did) that the video quality is not as good, because 1080i is the "native" resolution of my TV, and 720p scales down to slightly blurred images.
Also, at 720p you usually have some "borders" or "crops" at the edges, which are almost impossible to compensate for using the TV set's controls (and, incredibly, there is no option on the Fire Stick to zoom in/out for frame borders).
Thus, I'm waiting for the new Fire Stick 4K (which supposedly also has 1080i support), and using my basic Fire Stick at 720p resolution. BUT if there is a trick to sync at 1080i, I would be much happier with it.
I contacted Amazon support, but they barely understood my question.
I contacted amazon support, but they barely understood my question.
720p is far superior to actual 1080i, which is equivalent to 540p data-wise. Put it at 720p and forget it.
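The "540p data-wise" shorthand refers to scan lines per instant: each interlaced field carries only half the frame's lines. A minimal sketch of that argument (a simplification; real deinterlacers combine fields and recover more detail than a single field):

```python
# Scan lines delivered per displayed image: the core of the
# 1080i-vs-720p argument above.

def lines_per_instant(height, interlaced):
    """Lines actually present in one field (interlaced) or frame."""
    return height // 2 if interlaced else height

print(lines_per_instant(1080, interlaced=True))   # -> 540 (one 1080i field)
print(lines_per_instant(720, interlaced=False))   # -> 720 (one 720p frame)
```

Hence 720p delivers 720 full lines every frame, while 1080i shows only 540 lines at any one moment, which is the basis of the "equivalent to 540p" claim.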

3D movies using Kodi on Fire TV

Not sure if my issue is Kodi related or Fire TV related, but 3D MKVs (1080p SBS) do not work well on my Samsung TV in 3D mode through Kodi. They do display in 3D once I change the TV into that mode, but there's ghosting. Playing the MKV using the TV's built-in media player has always worked fine. I suspect Kodi isn't converting the SBS video completely correctly. OTOH, I had to calibrate the Fire TV due to my TV's overscan, so perhaps that has something to do with it? I'm not sure. Does anyone have any tips for getting 3D video to work?
Your TV is using additional filters etc. when playback runs internally, but it does not use these for external input via HDMI. Most likely you would see this with other external players too.
Concerning overscan: when you don't need it (e.g. you're not using any analogue video inputs), set it to 0 on your TV or deactivate overscan. That's a better solution than calibrating all attached devices for overscan, which normally isn't needed for digital HDMI input.
Calibaan said:
Your TV is using additional filters etc. when playback runs internally, but it does not use these for external input via HDMI. Most likely you would see this with other external players too.
Concerning overscan: when you don't need it (e.g. you're not using any analogue video inputs), set it to 0 on your TV or deactivate overscan. That's a better solution than calibrating all attached devices for overscan, which normally isn't needed for digital HDMI input.
That's interesting because I have two different TVs connected through HDMI that both required the image to be reduced 2-3% with the Fire TV. That ought to be digital. Am I missing something?
EDIT: Okay, here's what I was missing. Both TV sets were NOT set up with the optimal screen settings. The Samsung was set to 16:9, which you'd think would be the right setting, but it's not. The "User" setting, with the option to move the picture horizontally and vertically, is the "correct" one to use. The other TV that I was using overscan on had a similarly poorly worded option called "Fullscreen 100%" or something like that, which gave the actual resolution instead of the default "Normal" setting. Unbelievable!
The happy ending to this story is that 3D movies work perfectly. I hear there is a plugin for Kodi to automatically switch the TV into the correct 3D mode assuming the MKV file has the 3D encoding option specified. (I went through and fixed all the MKV metadata to include it.) I'll try to find it and test it out.
Those overscan mechanisms imply scalers, which may also corrupt the proper display of 3D content. I wonder why, still today, TVs are preprogrammed with overscan for digital inputs.
But your 16:9 issue isn't clear to me. A 16:9 TV should display everything at 100%. Or is it a non-16:9 TV, e.g. with a 21:9 aspect ratio? In that case the TV probably tried to stretch the 16:9 content instead of letterboxing, which may also lead to bad 3D behaviour.
Calibaan said:
Those overscan mechanisms imply scalers, which may also corrupt the proper display of 3D content. I wonder why, still today, TVs are preprogrammed with overscan for digital inputs.
But your 16:9 issue isn't clear to me. A 16:9 TV should display everything at 100%. Or is it a non-16:9 TV, e.g. with a 21:9 aspect ratio? In that case the TV probably tried to stretch the 16:9 content instead of letterboxing, which may also lead to bad 3D behaviour.
The default option on my Samsung 55" TV is "16:9". There's also "Wide" (4:3 stretched to fill the screen) and "Zoom" (4:3 zoomed so the top and bottom are cropped). It's not until you go to the "User" option that you realize that the default "16:9" option (the first setting) is actually zooming the image 2-3%! I can only suspect that Samsung assumed people would be watching a mix of analog and digital sources. You'd assume the cable company's box would take care of any issue with displaying analog signals to a digital TV, but I guess not. The moral of the story here is to not assume the TV isn't implementing an overscan on ALL signals by default.
EDIT: Apparently, my TVs aren't the only ones like this: http://www.engadget.com/2010/05/27/hd-101-overscan-and-why-all-tvs-do-it/

Fire TV 4K stick: auto framerate switching only works once :(

Hello,
my new Amazon Fire TV Stick 4K is attached to a Denon AVR 1912, which is in turn connected to a Panasonic LCD TV. I would like the stick to switch the refresh rate (24Hz, 50Hz, 60Hz, etc.) automatically according to the source framerate, which can be enabled in the stick's settings menu. The problem is that this setting only works once after it has been selected. If the stick hibernates and wakes up again, it stays stuck at 60Hz no matter what framerate your video has. It stays that way until you change *something* (any parameter at all) in the Fire TV video settings, e.g. switching from "HDR" to "disable HDR"; then auto framerate works again, until the next hibernation, and so on, which is very inconvenient.
You can also see it clearly when you install (sideload) Kodi on the stick: the whitelist of possible resolutions will normally offer a great variety, but when the stick is stuck at 60Hz, Kodi will also only allow that one refresh rate, and the same is true for all other apps on the stick that make use of framerate switching.
I have read a lot on the net that some people only have this problem when using an AVR (as in my setup), but for some people it also happens with the stick attached directly to the TV. So it seems Amazon's devs have completely messed up this essential feature, which was missing entirely from the previous Fire TV sticks. Now I was happy to finally have it, only to find it doesn't work properly. Very poor performance, Amazon. Is there anything one can do about it? Any way to create a log and send it to Amazon's devs so they can fix the problem?
Refresh rate is not equal to framerate.
If there is such a dramatic difference using regular pulldown, you might need a new tv.
No media is created with variable refresh rates in mind.
TimmyP said:
Refresh rate is not equal to framerate.
If there is such a dramatic difference using regular pulldown, you might need a new tv.
No media is created with variable refresh rates in mind.
Sorry, but I think you totally misunderstood my post: I am using various apps on the Fire TV Stick, e.g. Kodi, playing media with various framerates. And there is of course a relationship between the framerate a video was created at and a suitable refresh rate, which in the best case is the same as, or an integer multiple of, the video's framerate (e.g. if a video was recorded at 25fps, then 50Hz would be a good choice for auto framerate; 60Hz would not!). But this is not the point to discuss here.
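The integer-multiple rule above can be sketched as a quick check. This is just a hypothetical helper, not anything the stick actually runs; the list of supported rates is only an example:

```python
def matching_refresh_rates(fps, supported=(23.976, 24.0, 25.0, 30.0, 50.0, 60.0)):
    """Return the display refresh rates that are an integer multiple of fps.

    A small tolerance absorbs NTSC-style rates such as 23.976 vs. 24.0.
    """
    good = []
    for hz in supported:
        ratio = hz / fps
        if abs(ratio - round(ratio)) < 0.002:
            good.append(hz)
    return good

print(matching_refresh_rates(25.0))  # a 25fps video pairs with 25Hz and 50Hz
```

With a 25fps source this picks 25Hz and 50Hz and rejects 60Hz, which is exactly why being stuck at 60Hz produces judder on 25/50fps material.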
The point is that the Fire TV Stick can only select an appropriate refresh rate once it has been set in the video options of the stick; as soon as it hibernates once, it is stuck at that refresh rate, no matter which other media is played. This does not only affect Kodi; it affects, as I wrote, all the apps on the Fire TV Stick. And as soon as you select any other option in the video options, it works again - but only once!
It doesn't matter. There is absolutely no point in watching video at a refresh rate lower than the highest you have available.
You need to make sure that in Kodi Settings -> System -> Video -> Refresh rates you see the non-standard refresh rates!
I had a problem where, while connected to a 5.1 AVR, it wasn't showing 50fps or lower.
For 4K material, auto framerate doesn't work, so to enable it for a movie, first play a small Full HD starter clip with the same fps as the 4K movie being played. Works for me every time. I've cut 3-second clips from my movies via ffmpeg.
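For reference, a starter clip like that might be cut along these lines. A minimal sketch with placeholder filenames, assuming ffmpeg is on the PATH and a 1080p source with the same fps as the 4K movie; stream-copying avoids re-encoding, so the clip keeps the source framerate:

```python
def starter_clip_cmd(source, target, seconds=3):
    """Build an ffmpeg command that stream-copies the first few seconds.

    '-c copy' avoids re-encoding, so the clip keeps the source framerate.
    """
    # Pass the result to subprocess.run(cmd, check=True) to actually run it.
    return ["ffmpeg", "-y", "-t", str(seconds), "-i", source, "-c", "copy", target]

print(starter_clip_cmd("movie_1080p.mkv", "starter.mkv"))
```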
None of this matters. Any difference you see is completely placebo. There is never a circumstance where you do not want to use the pulldown your TV gives you.
edit: passthrough for everything, as much as possible, unless there are specific Fire TV software-only needs.

UK IPTV Recording...

Hi all
I have the stick & the Pro and have been using them mainly for IPTV, Netflix, YouTube etc., but I have moved house and now lack an aerial, so my old BT box that I used for Freeview and recording is of no use. There is a satellite dish and LNB.
So I was wondering if I can use the Shield for this, either by connecting the LNB somehow or via an IPTV player that records. So a couple of questions...
As I understand it, there is no way to connect the Shield to the LNB and use it like that. Is that correct? If I am wrong, how do I do that, please?
Secondly, if I cannot do the above (or even if I can), would an IPTV player app or similar with the facility to record be a better option? If so, does anyone have any recommendations? I don't mind paying for the app as long as it works. I have read good things about the 'IPTV Tuner' app but cannot find anywhere whether it records or is just for viewing!
I've seen this but it has not been commented on for some time:
nVidia Shield Android TV
nVidia's Shield Android TV is an Android TV based set-top box, targeted at both home entertainment and gaming. The nVidia Shield Android TV was released in May 2015, and is the third-generation Shield hardware device. It can output 4K resolution to TV over HDMI and supports high definition...
forum.xda-developers.com
Thanks to all for reading.
