Radeon GPU Temperature with VirtualSMC

  • @Aluveitie I'm getting a panic with your kext.

    I tried placing it directly after Lilu and moving it to the last position, without success.

    I'm uploading my EFI to Google Drive, since it's too large for the forum.

    Link:

    https://drive.google.com/file/…PsBGZwbJ/view?usp=sharing

    Windows: AMD 3900X | GIGABYTE AORUS X570 Master | 16 GB GSkill Trident Z DDR4-3600 CL18 | MSI RX 6700 XT Gaming X | Auzentech X-Fi HT HD | WD SN750 | Corsair RM750i

    MAC OS: AMD 2600 | GIGABYTE AORUS B450 i Pro | MSI RX 480 Gaming X | 16 GB AORUS DDR4-3200 CL16 | 500 GB M.2 SSD | Enermax Platimax D.F | MacOS X 11 + OC

  • Gordon-1979 The ExecutablePath is missing.

    • Apple Mac Studio | M1 Ultra | 64GB RAM | 1TB
    • PowerMac G5 | Dual 2GHz | 8GB RAM | GeForce 6800 Ultra DDL
    • AMD Ryzen 9 3950X | ASUS Strix X570-I Gaming | 64GB DDR4-3600 CL16 RAM | Corsair MP600 M.2 NVMe | Radeon RX 6900 XT | Phanteks Enthoo Evolv Shift | Custom Loop | MacOS 12 | OpenCore
      Ryzen MacPro | EFI | RadeonSensor | Aureal
  • @Aluveitie, thanks. It works, the temperature is displayed.

  • Do I strictly need Lilu and VirtualSMC for this kext, or can I also use just Lilu and FakeSMC?

    I would be very reluctant to switch to VirtualSMC, since with it iStat Menus shows me far fewer values for CPU cores and temperatures than with FakeSMC.

    ASUS WS X299 SAGE/10G • Intel Core i9-7920X 12-Core 2.9GHz • 128GB RAM • ASRock Radeon VII Phantom Gaming • 2x Samsung 980 NVMe M.2 SSD 1 TB
    Custom Wasserkühlung • Thermaltake TheTower 900 • 1x SAMSUNG 49" @ 5120 x 1440 (100Hz) via DP • LG OLED 55" TV @ 3840 x 2160 (100Hz) via HDMI
    WINDOWS 11 ENTERPRISE INSIDER (PRO950 NVMe) • macOS BIG SUR und MONTEREY latest Build (jeweils auf Samsung 980 NVMe) • OpenCore always latest

  • Mork vom Ork The kext only has a dependency on Lilu. It is essentially just RadeonMonitor without the FakeSMC dependency, so it can be used with VirtualSMC. If you are already on FakeSMC, you can just use the original :)

  • It's just a sketch, nothing precise. Maybe like this in dark, and in red above a threshold? :)


    With those values something would be seriously wrong 😅


    Seriously though: I like it a lot, it looks good and is very usable.

  • Aluveitie It still works after sleep.

  • Aluveitie

    Will you restore the max temp value in the future, or is it not in your gadget's feature plan?

  • fabiosun My plan is to show the frequency in its place if I can get that working. I didn't find the max temp all that useful after all.

  • I am studying my maximum temperatures with DaVinci Resolve to see whether my GPU throttles under that load. The GPU works well and I have no problems at all, but I suspect it throttles, and I don't like that.


    I think it is also throttling in LuxMark 3.1.


    By the way, is your 6900 XT a reference AMD card?

    If so, have you posted your power table settings?

    I'm having some trouble in Windows following the interesting guide posted in this forum.

  • I have a reference 6900 XT, though it is water-cooled and undervolted.

    I don't publish the power table in my GitHub repo because it is not guaranteed to work on all cards.


    As soon as I applied even a small undervolt it would crash. For the 6000 series you have to limit the max frequency together with the core voltage. I lost about 5% performance, but power draw dropped to around 250 W, which is not too bad. The 1% lows would probably drop a little more, I guess, since that is where the peak frequency usually applies.


    So AMD is tuning these cards much better than in the past, when they sometimes shipped with ridiculously high voltages, like the VII.

  • Aluveitie

    Your power table settings work well here.

    About 10° less over 66 nodes (Candle benchmark), and maybe about 7% slower using this frequency and undervolt.


    May I ask whether you have refined the temperature reading? I'm asking because I no longer see the GPU at its maximum value (95° on the hot-spot sensor without the power table settings).


    Anyway, thank you so much. I have to study the how-to more to find less conservative values than yours...

    Very happy about it, almost like when I managed to get OS X working with my 3970X long ago :)

    Thank you very much



  • fabiosun It's pretty simple, though. Start Unigine Valley and use MSI Afterburner to try different settings. With the HUD you can track frequency, temperature and power usage as you adjust the sliders.

    Once you find settings that suit you and don't crash the benchmark, you can enter them into MorePowerTool, write the SPPT to the registry, and extract it from there.
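    The extraction step can be sketched from an elevated Windows command prompt. AMD's driver reads the soft PowerPlay table (SPPT) from a registry value on the display-adapter class key; the `0000` subkey index and the value name `PP_PhmSoftPowerPlayTable` are assumptions based on common AMD driver layouts, not details confirmed in this thread:

    ```bat
    :: Sketch: dump the SPPT that MorePowerTool wrote to the registry.
    :: The 0000 subkey index varies per system -- pick the subkey that
    :: matches your Radeon GPU before copying the binary value out.
    reg query "HKLM\SYSTEM\CurrentControlSet\Control\Class\{4d36e968-e325-11ce-bfc1-08002be10318}\0000" /v PP_PhmSoftPowerPlayTable
    ```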

  • Most cards accept a 1075 mV SoC voltage, which drops the TBP further for either more clock speed or more power saving; with MPT settings you get more precise control.


    That is where you can define the max core speed and voltage, so your Windows driver or Radeon Software can't override them with its own settings.


    That is much more effective and gives you about 170 W on a 6800, 190 W on a 6800 XT and not much more than 220-230 W on a 6900, as Aluveitie said.


    But with the SoC down to 1075 mV you have more wattage left for the chip, which results in higher clocks.

  • The latest version, 0.2.0, brings an app icon, a compatibility check, and support for the RX 6600 XT.


    https://github.com/aluveitie/RadeonSensor/releases/tag/0.2.0

  • With release 0.3.0 there is now also an SMCRadeonGPU kext that exports the GPU temperature to VirtualSMC, so monitoring tools like iStats, Sensei, etc. can read it.

    Important: SMCRadeonGPU requires the VirtualSMC and RadeonSensor kexts and must be loaded after both.


    https://github.com/aluveitie/RadeonSensor/releases/tag/0.3.0
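
    The required load order can be expressed in OpenCore's config.plist, since kexts are injected in the order they appear under Kernel → Add. A minimal sketch (other per-entry keys such as Arch, MinKernel or Comment omitted; executable paths assume the binaries are named after their bundles):

    ```xml
    <!-- Kernel → Add: OpenCore injects kexts in array order -->
    <key>Add</key>
    <array>
        <dict>
            <key>BundlePath</key><string>Lilu.kext</string>
            <key>ExecutablePath</key><string>Contents/MacOS/Lilu</string>
            <key>PlistPath</key><string>Contents/Info.plist</string>
            <key>Enabled</key><true/>
        </dict>
        <dict>
            <key>BundlePath</key><string>VirtualSMC.kext</string>
            <key>ExecutablePath</key><string>Contents/MacOS/VirtualSMC</string>
            <key>PlistPath</key><string>Contents/Info.plist</string>
            <key>Enabled</key><true/>
        </dict>
        <dict>
            <key>BundlePath</key><string>RadeonSensor.kext</string>
            <key>ExecutablePath</key><string>Contents/MacOS/RadeonSensor</string>
            <key>PlistPath</key><string>Contents/Info.plist</string>
            <key>Enabled</key><true/>
        </dict>
        <dict>
            <key>BundlePath</key><string>SMCRadeonGPU.kext</string>
            <key>ExecutablePath</key><string>Contents/MacOS/SMCRadeonGPU</string>
            <key>PlistPath</key><string>Contents/Info.plist</string>
            <key>Enabled</key><true/>
        </dict>
    </array>
    ```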

  • Thanks for the update.

    The bundled gadget shows the GPU temperature.

    In the Macs Fan Control app the same value is shown under GPU Proximity. My current iStat Menus, however, shows nothing for GPU Proximity.

    See screenshot.

    Kext order: VirtualSMC, RadeonSensor, SMCRadeonGPU.


    Can I help with debugging or provide more information, or does the problem lie with iStat Menus? Since Macs Fan Control now sees the value (before your update it didn't), I would imagine iStat Menus could see it too.

  • Thanks for testing. Can you try the following kext?

    Files
