Theoretically, older iMacs and pre-2018 Mac laptops can play 4K content, but they'll miss out on streaming Netflix in 4K due to the lack of a T2 security chip. For reference, these are the Macs that pack a T2 security chip: the 2018 or later MacBook Pro, 2018 or later MacBook Air, 2018 Mac mini, 2019 Mac Pro, iMac Pro and 2020 iMac.
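If you're unsure whether your own Mac carries the chip, one way to check is to query macOS's built-in system_profiler tool for its iBridge report, which is where the T2 is listed. The Swift sketch below is only an illustration of that idea, not anything Apple or Netflix document as an official check; the hasT2Chip helper is a hypothetical name, and the sketch assumes the report mentions the T2 by name on machines that include one.

```swift
import Foundation

// Minimal sketch: shell out to the built-in `system_profiler` tool and look for
// an Apple T2 entry in its iBridge report. Assumption: on Macs without a T2,
// the SPiBridgeDataType section is empty or omits the chip's name.
func hasT2Chip() -> Bool {
    let task = Process()
    task.executableURL = URL(fileURLWithPath: "/usr/sbin/system_profiler")
    task.arguments = ["SPiBridgeDataType"]

    let pipe = Pipe()
    task.standardOutput = pipe

    do {
        try task.run()
        task.waitUntilExit()
    } catch {
        return false  // couldn't launch system_profiler at all
    }

    let data = pipe.fileHandleForReading.readDataToEndOfFile()
    let output = String(data: data, encoding: .utf8) ?? ""
    return output.contains("T2")
}

print(hasT2Chip() ? "T2 security chip detected" : "No T2 security chip found")
```

Running system_profiler SPiBridgeDataType directly in Terminal should surface the same information without any code.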
Although Netflix did not divulge why a T2 chip is required, fans of the streaming service are baffled. As pointed out by 9to5Mac, one Reddit poster asked, "Windows machines don’t have any kind of T2 alternative and are still able to stream 4K via Edge. Their only requirement is a 7th-Gen Intel CPU or a dedicated graphics card. Does anyone know why that is?"
A recurring theory among Reddit posters is that while many T2-absent Macs can stream 4K just fine, Hollywood is gung-ho on pushing Digital Rights Management (DRM) protections against content infringement. "This is not a choice by Apple," one Reddit poster theorized. "The video providers (Netflix, Amazon Prime, etc.) don’t allow the videos to be played without DRM, and DRM is built into the T2." Another Redditor supported that conjecture, adding that Windows users, too, cannot watch 4K content without a supported processor (Kaby Lake and newer) that features DRM technology embedded into the hardware.

One M1 Mac mini owner shared a contrasting data point: "So far as my situation goes, 4K Dolby Vision Netflix playback works on my M1 Mac mini using Safari 14 on Big Sur. I connect the Mac mini to a 4K monitor using an HDMI cable. PS: that did not happen when I got the new Mac mini on day one, so perhaps there has been some software adjustment since." He happens to be using a 16GB Mac mini, the 16GB referring to the memory, along with Final Cut Pro; he wants to use as much of the M1 chip’s potential as possible for these tests, and he also makes sure to turn off as many other apps as possible.

It's fascinating to see how big-name studios have had such an influential impact on PC technology.