AppleTV X – The Mod



All Image Credit: AppleTV X

This power supply and hardware upgrade is for the AppleTV 4K. 

The AppleTV series of hardware uses small and very efficient internal switched-mode power supplies. These types of supply are well known for the noise and RF interference they can create through radiation, on their DC output and backwards up the AC power cord. (See my MX-VYNL analog PSU upgrade here and here.) If this noise enters the video and audio digital circuitry it can cause jitter and phase noise on the clocks that drive the CPU, RAM, SSD, Ethernet controller and HDMI chip, potentially causing digital errors.
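
To put some rough numbers on that mechanism, here is a small, purely illustrative Python sketch (the sample rate, test tone and jitter figure below are assumptions of mine, not measurements of the AppleTV or of this mod). It samples a sine wave with a jittered clock and compares the resulting noise floor against the well-known estimate SNR ≈ −20·log10(2π·f·t_jitter):

    # Illustrative only: how RMS clock jitter raises the noise floor of a
    # sampled signal. All figures are assumed for the example.
    import numpy as np

    fs = 48_000            # sample rate, Hz (assumed)
    f_in = 10_000          # test tone, Hz (assumed)
    n = 1 << 16            # number of samples
    t_jitter = 1e-9        # 1 ns RMS clock jitter (assumed)

    t_ideal = np.arange(n) / fs
    t_real = t_ideal + np.random.normal(0.0, t_jitter, n)

    clean = np.sin(2 * np.pi * f_in * t_ideal)
    jittered = np.sin(2 * np.pi * f_in * t_real)

    # The error comes purely from sampling-instant uncertainty
    noise = jittered - clean
    snr_measured = 10 * np.log10(np.mean(clean**2) / np.mean(noise**2))
    snr_theory = -20 * np.log10(2 * np.pi * f_in * t_jitter)

    print(f"measured SNR ~ {snr_measured:.1f} dB, closed-form ~ {snr_theory:.1f} dB")

With the 1 ns of RMS jitter assumed here, a full-scale 10 kHz tone is already limited to roughly 84 dB of SNR, and the estimate loses about 6 dB for every doubling of signal frequency or jitter.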

The AppleTV Switched Mode Power Supply Board

What this upgrade does is to:

  • Remove the inexpensive internal switched-mode power supply and replace it with an external, modified, remote-sensing, high-quality analog power supply.
  • Add additional power supply filtering in the location vacated by the removed switched-mode supply.
  • Add further hardware improvements to:
    • Lower the phase noise on the clocks
    • Clean up the supplies to the chips
    • Improve the power input termination impedance
    • Improve overall RF noise control


Power Supply Filtering Replacing the AppleTV Switched Mode Supply

According to its designer, Chris Stephens, “these modifications radically remove noise on ALL busses/inputs/outputs/HDMI”, and “this de-jittering of the CPU/SSD/RAM/ETHERNET/HDMI as a whole is what makes it magical”.

The improvement in power supply noise reportedly “results in far lower error correction and concealment by the HDMI sink device (TV) which then provides far more of the original bits without loss”. This improvement can clearly be seen in the following two images.

Power Supply Noise Before and After the Upgrade

According to the designer: “The overall result in picture and sound is startling.” 

All I can say is that if it improves the AppleTV's video and audio quality as much as my analog power supply upgrade improved the audio quality of my MX-VYNL, then the AppleTV X should look and sound very good.

This custom upgrade is NOT for the financially faint of heart.

MSRP: $2,500.00


For more engineering information, check out AppleTV X – The Magic.

Want to purchase one? Then go to GTT Audio.



8 thoughts on “AppleTV X – The Mod”

  • Marty

    This may be a good device, but it is antiquated from the outset now that almost every smart TV has a built-in Apple TV App. Is it $2,500 better than that? Put an LPS on your cable modem or nearest satellite router that feeds your wireless TV signal and the answer is, well, perhaps a bit, but $2,500 worth? Your call. I’ll pass. The picture quality using the built-in Apple TV App on my 85″ Samsung is quite good. No need to watch Ted Lasso on anything better, IMHO.

    • fromvinyltoplastic Post author

      Hi Marty,
      My post is not an endorsement of the upgrade. I was just trying to bring it to the attention of any readers who feel that a little DIY work with analog power supplies might benefit them.
      It is a great deal of money for what is probably a small benefit, and, just like you, I am totally happy with my AppleTV 4K as it stands. I only use it for relaxed family-room evening viewing on my 42″ plasma HDTV, never in my 4K HT, which is a physical-media-only zone!
      Thank you for popping by and for your comment.
      Regards
      Paul

  • Richard

    I’m sure this comment will be removed as ‘I just don’t understand’, but there’s no way that any power modification to a device can affect the image quality of a digital HDMI signal. It just doesn’t work that way.

    • fromvinyltoplastic Post author

      Hi Richard,
      I have never removed a comment, unless it was abusive. We are all entitled to our opinions. 🙂
      It is quite common for many to think that digital is digital and that “cleaning up” power supply noise, grounding issues, etc. has no effect. Fortunately, that is not true. I assume you understand that noise can and does impact analog amplifiers, and in particular ADCs and DACs, by affecting jitter, quantization errors, aliasing and especially inter-sample peak clipping. This is easy to confirm with many analog amplifiers and ADC/DAC circuits by swapping their low-cost, noisy switched-mode supplies for very-low-noise analog supplies. My site has a number of posts, based upon practical experience, on phono and head amps that fully confirm these benefits.
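      As a quick, purely illustrative aside on the inter-sample peak point (the sample rate and test tone below are assumptions chosen for the example, not measurements of any device): a signal can sit at exactly 0 dBFS on every stored sample and still reconstruct to a peak roughly 3 dB higher at the DAC.

          # Illustrative only: a "digitally full scale" signal that overshoots
          # between samples once it is reconstructed.
          import numpy as np
          from scipy.signal import resample

          fs = 48_000                  # sample rate, Hz (assumed)
          n = 4_800                    # an exact number of cycles at fs/4
          t = np.arange(n) / fs

          # Sine at fs/4 with a 45-degree phase offset: every stored sample
          # lands at +/-0.707..., so normalising puts them all at 0 dBFS.
          x = np.sin(2 * np.pi * (fs / 4) * t + np.pi / 4)
          x /= np.max(np.abs(x))

          # 8x oversampling stands in for the DAC's reconstruction filter
          y = resample(x, 8 * n)

          print(f"sample-domain peak: {np.max(np.abs(x)):.3f}")   # 1.000
          print(f"reconstructed peak: {np.max(np.abs(y)):.3f}")   # ~1.414, about +3 dB

      Run with NumPy and SciPy installed, this prints a sample-domain peak of 1.000 and a reconstructed peak of about 1.414, i.e. roughly +3 dB over “full scale”.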
      Well, noise in digital circuits can also impact those analog “digital” sample words, particularly by increasing jitter and affecting timing. Ultimately these issues show up as image luminance and chrominance noise, or as a loss of resolution in the image.
      Years ago I upgraded all my players to serial SDI output for both SD and HD video (I hate HDMI). This upgrade is nothing more than the addition of a professional SDI serializer chip. Now I had two outputs of EXACTLY the same digital video data: one via encrypted HDMI, the other via unencrypted SDI. You guessed it: the SDI signal provided better contrast, less noise, improved chrominance saturation and was sharper!! The ultimate conversion of digital data to an analog signal is always fraught with issues and controversies which, with a little research, can be understood.
      Unfortunately this is not the venue for a “deep dive” into this topic. However, it is safe to say that low-noise power supplies, optimized grounding and careful PCB layout all significantly contribute to the quality of a video (and audio) signal, whether it is digital or analog.
      Remember that these noisy switched-mode supplies are used for their high efficiency, low heat dissipation, low cost and small size, NOT for any benefit to the circuitry.
      Thank you for popping by and for your comment
      Regards
      Paul

      • Gabor

        This is absolutely impossible, as this is not how image processing and transmission works with HDMI.

        1. The decoding of the H.264/HEVC stream is done by the SoC with the software running on it, and so we get three 8-bit RGB images (pixel-based digital image data) and a Pixel Clock (for this aspect, the audio is not interesting).
        2. This data is transmitted to the receiving device via the TMDS protocol.

        I think everyone understands that a noisy power supply can’t interfere with the operation of software in a way that decodes the stream in a completely different way (especially not by changing the resolution, brightness, saturation, etc…).

        • fromvinyltoplastic Post author

          Hi Gabor,
          I appreciate your input.
          I did not claim that either the TMDS protocol or the SoC code is in some way impacted by clock/data jitter and noise; that, as you say, is probably “absolutely impossible”. It is the creation of the digital data in the source, and its conversion to the final clock and analog video control signals in the display device, that impacts the image quality. Clock/data jitter and noise can and do impact the DAC process, be it video or audio.
          Regards
          Paul

          • Gabor

            At any time, it is easy to prove that this is not true. You would need to make a power supply whose noise level can be controlled, even to extreme degrees, and switch it on and off during playback. Any transmission errors/noise can damage pixels and the clock signal, but these will show up as digital errors (sparkles, blocking). There is no analogue video signal in the chain, so there will never be any modification of the picture of the kind you describe.

            • fromvinyltoplastic Post author

              Hi Gabor,
              It looks like we will have to agree to disagree, as this is not the forum for an in-depth, ongoing technical discussion, and I never mentioned bit errors, only jitter. With that said, the only comparative video tests I have run were with my DVDO pro using HDMI, SDI and RGBHV I/O driving a BENQ W10000, with two modified Denon Blu-ray players with serial out. The outright winner for the better image was SDI to the DVDO and HDMI from the DVDO to the BENQ. The image was noticeably sharper, with lower chroma noise, than when using HDMI (or component) from the players.
              The change to a low-noise analog power supply was only barely observable when I was using RGB signals throughout.
              Regards
              Paul