Show HN: Keep your PyTorch model in VRAM by hot swapping code

(github.com)

71 points | by valine 18 hours ago

7 comments

  • NitpickLawyer 13 hours ago

    We use Python notebooks for that functionality in the early stages of script testing: load the model in a cell up top, then do your stuff below, and once things look good convert it to a normal Python script.
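
    A minimal sketch of that workflow (the module and model names here are illustrative, not from the linked repo): load the weights once in a top cell so they stay in VRAM, keep the code you are iterating on in its own module, and reload that module after each edit.

        # cell 1: run once; the model stays resident in VRAM
        import torch
        from transformers import AutoModelForCausalLM

        model = AutoModelForCausalLM.from_pretrained("gpt2").cuda()

        # cell 2: re-run after every edit to my_experiment.py (hypothetical module)
        import importlib
        import my_experiment

        importlib.reload(my_experiment)
        my_experiment.run(model)  # uses the already-loaded weights, no reload from disk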

  • pizza 14 hours ago

    The tensor visualizer app itself already looks pretty interesting.

    • valine 14 hours ago

      Thanks, I will do an in-depth writeup on that at some point.

      • kombine 12 hours ago

        Are you running both the Dear ImGui visualisation and training locally? If not, how can one use it in client-server mode? I think this is the most common requirement for visualisation libraries in deep learning.

        • valine 12 hours ago

          The rendering is done with OpenGL, and for remote viewing I just render to an offscreen framebuffer and stream it back to the client with WebRTC. The code for that isn’t public yet, still needs some cleanup.
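
          A rough sketch of that kind of pipeline, assuming moderngl for the offscreen OpenGL framebuffer and aiortc for the WebRTC leg (neither library is confirmed by the author, so treat the names as placeholders):

              import moderngl
              import numpy as np
              from aiortc import RTCPeerConnection, VideoStreamTrack
              from av import VideoFrame

              WIDTH, HEIGHT = 1280, 720

              # offscreen OpenGL context: draw into a framebuffer instead of a window
              ctx = moderngl.create_standalone_context()
              fbo = ctx.simple_framebuffer((WIDTH, HEIGHT), components=3)

              def render_frame() -> np.ndarray:
                  fbo.use()
                  fbo.clear(0.1, 0.1, 0.1)
                  # ... draw the tensor visualization here ...
                  data = fbo.read(components=3)
                  return np.frombuffer(data, dtype=np.uint8).reshape(HEIGHT, WIDTH, 3)

              class FramebufferTrack(VideoStreamTrack):
                  # feeds offscreen frames into a WebRTC video track
                  async def recv(self):
                      pts, time_base = await self.next_timestamp()
                      frame = VideoFrame.from_ndarray(render_frame(), format="rgb24")
                      frame.pts = pts
                      frame.time_base = time_base
                      return frame

              # the track is added to a peer connection and negotiated with the browser client
              pc = RTCPeerConnection()
              pc.addTrack(FramebufferTrack())

          The signaling (offer/answer exchange with the browser) is omitted here; aiortc's server examples cover that part.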

    • iaw 5 hours ago

      Yeah, sadly the link to their visualizations is gated behind X.com.

      • CheeksTheGeek 4 hours ago

        You can use xcancel.com by inserting "cancel" after the "x" in the URL.