6 comments

  • codetrotter 2 days ago
    It's a very specific kind of feedback, but your app doesn't seem to be aware of how the top bar works on, for example, the iPhone 15 Pro.

    This results in part of your app's UI shown on first launch (the "Window Controls" menu, containing "App Tools", "Files", "Editor", "GitHub") being obscured by the black oval that surrounds the camera at the top of the display on these phones. This happens in portrait as well as landscape orientation.

    You'll probably be able to see the same issue if you select the iPhone 15 Pro or a similar model in the iPhone simulator in Xcode.

    • codetrotter 2 days ago
      Also, I tried to open Glyph3D.xcodeproj from the GitHub repo in Xcode and got an error message that says:

      > The project ‘Glyph3D’ is damaged and cannot be opened. Examine the project file for invalid edits or unresolved source control conflicts.

      > Path: /Users/user/src/LookAtThat/Glyph3D.xcodeproj

      > Exception: didn't find classname for 'isa' key

      Although this could also just be that my installed version of Xcode is too old. I haven't used Xcode in a while, so I haven't updated it in a while either.

      My currently installed version is Xcode 15.0.

      • tikimcfee 2 days ago
        Hey there!

        In reverse order, I'd say yes, you may want to try at least Xcode 15.4, and preferably 16.0. I've also had issues with older versions of Xcode on the M series, and have had to update begrudgingly across versions over the years.

        And, to the 'safe area' issue you're mentioning, I do appreciate the comment. I have a pretty rough version of windowing in the app, and it partially relies on a full view layer. Ideally I'd take advantage of said safe area, but I'd need to clean up a few things first.
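        For illustration, here's a minimal sketch of the kind of safe-area handling being discussed, assuming a SwiftUI host for the window layer (the view and its contents are hypothetical stand-ins, not the app's actual code):

            import SwiftUI

            // Hypothetical full-screen window layer whose chrome should stay
            // clear of the Dynamic Island / notch.
            struct WindowLayerHost: View {
                var body: some View {
                    GeometryReader { proxy in
                        ZStack(alignment: .top) {
                            Color.clear // the full view layer underneath

                            // Offset the draggable window chrome by the reported top
                            // inset so the camera cutout doesn't obscure it.
                            Text("Window Controls")
                                .padding(8)
                                .background(.ultraThinMaterial, in: Capsule())
                                .offset(y: proxy.safeAreaInsets.top)
                        }
                    }
                    .ignoresSafeArea() // the layer itself still fills the screen
                }
            }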

        Now that I'm being reminded that this is a first-time user experience, I'll put this on the list!

  • I could see this working really well with call flow graphs for reverse engineering or disassemblers like IDA or Ghidra; see https://clearbluejar.github.io/assets/img/2023-01-22-callgra...

    Can this work with visionOS?

    • tikimcfee 2 days ago
      > call flow graphs

      Stay tuned! This is a planned feature I had temporarily removed while I worked on performance in the Metal engine, and it's coming back very soon =) I'd love the opportunity to pick your brain about potential intermediate formats as well - e.g., if you have specific languages or datasets that you'd want to be able to operate on. I have some ideas about adding a tiny runtime scripting layer to this as well, to operate on text JIT instead of just laying it out.

      > Can this work with visionOS?

      Theoretically yes, but I haven't done the work to composite the two rendering layers together. As with a lot of Metal, there's a massive amount of documentation that would certainly be helpful if it existed, but it currently doesn't. I do plan on this in the future, however.

      • v1sea 2 days ago
        I built an interactive call graph interface targeting visionOS that I'll be open sourcing soon. My approach is to use the Language Server Protocol (LSP) as the intermediate format, as many LSP implementations support the call hierarchy methods. You can then proxy the commands to the headset for rendering. Using the LSP is nice because you can integrate the controls into small editor plugins and get live code analysis.
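        For anyone following along, the call-hierarchy flow described here is two LSP requests: textDocument/prepareCallHierarchy to resolve the symbol under a position, then callHierarchy/outgoingCalls (or incomingCalls) to walk edges. A rough Swift sketch of the payload shapes follows; the field names track the LSP 3.16 spec, but the sendRequest transport is a hypothetical placeholder.

            import Foundation

            // Minimal shapes for the LSP 3.16 call-hierarchy methods.
            struct LSPPosition: Codable { let line: Int; let character: Int }
            struct LSPRange: Codable { let start: LSPPosition; let end: LSPPosition }
            struct TextDocumentIdentifier: Codable { let uri: String }

            struct CallHierarchyItem: Codable {
                let name: String
                let kind: Int                // SymbolKind
                let uri: String
                let range: LSPRange          // full extent of the symbol
                let selectionRange: LSPRange
            }

            struct CallHierarchyOutgoingCall: Codable {
                let to: CallHierarchyItem
                let fromRanges: [LSPRange]   // call sites inside the source item
            }

            struct PrepareCallHierarchyParams: Codable {
                let textDocument: TextDocumentIdentifier
                let position: LSPPosition
            }

            struct OutgoingCallsParams: Codable { let item: CallHierarchyItem }

            // Hypothetical transport: encode a JSON-RPC request, send it to the language
            // server (or proxy it to the headset), and decode the matching response.
            func sendRequest<P: Encodable, R: Decodable>(_ method: String, _ params: P) async throws -> R {
                fatalError("wire this up to your JSON-RPC client")
            }

            // Resolve the item under the cursor, then expand one level of outgoing calls.
            func expandOneLevel(uri: String, at position: LSPPosition) async throws -> [CallHierarchyOutgoingCall] {
                let items: [CallHierarchyItem] = try await sendRequest(
                    "textDocument/prepareCallHierarchy",
                    PrepareCallHierarchyParams(textDocument: .init(uri: uri), position: position))
                guard let root = items.first else { return [] }
                return try await sendRequest("callHierarchy/outgoingCalls", OutgoingCallsParams(item: root))
            }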

        Most code analysis programs fail in some combination of editor integration, language support, iteration speed, or interactivity.

        One of the big issues with visionOS right now is the divide between a full immersive space and a volume. There is a dev option that lets the virtual display work in an immersive space, but normally a fully Metal-rendered scene will hide your Mac monitor. The volume requires RealityKit and provides no hand tracking data. My approach is to run a Fruchterman-Reingold spring embedder for the graph in a Metal compute kernel, updating a LowLevelMesh in a volume. The biggest limit I've faced is that around 1000 input targets (graph nodes) is the most the RealityKit renderer can support before the frame rate dips under 90 Hz.
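        For anyone unfamiliar with the layout algorithm named above, a Fruchterman-Reingold step is just pairwise repulsion plus attraction along edges, clamped by a cooling temperature. Below is a rough CPU-side Swift sketch of one iteration for illustration; it makes no attempt to reproduce the Metal compute kernel or LowLevelMesh plumbing described above.

            import simd

            struct Edge { let a: Int; let b: Int }

            // One Fruchterman-Reingold iteration over `positions`.
            // `area` sets the overall spread of the layout; `temperature` caps per-step movement.
            func fruchtermanReingoldStep(positions: inout [SIMD3<Float>],
                                         edges: [Edge],
                                         area: Float,
                                         temperature: Float) {
                let n = positions.count
                guard n > 1 else { return }
                let k = (area / Float(n)).squareRoot()    // ideal edge length
                var displacement = [SIMD3<Float>](repeating: .zero, count: n)

                // Repulsion: every pair of nodes pushes apart with magnitude k^2 / d.
                for i in 0..<n {
                    for j in (i + 1)..<n {
                        let delta = positions[i] - positions[j]
                        let dist = max(length(delta), 1e-4)
                        let scale = (k * k) / (dist * dist)
                        displacement[i] += delta * scale
                        displacement[j] -= delta * scale
                    }
                }

                // Attraction: connected nodes pull together with magnitude d^2 / k.
                for edge in edges {
                    let delta = positions[edge.a] - positions[edge.b]
                    let dist = max(length(delta), 1e-4)
                    let scale = dist / k
                    displacement[edge.a] -= delta * scale
                    displacement[edge.b] += delta * scale
                }

                // Move each node, clamped by the current temperature (cooled by the caller).
                for i in 0..<n {
                    let dist = max(length(displacement[i]), 1e-4)
                    positions[i] += (displacement[i] / dist) * min(dist, temperature)
                }
            }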

        Good luck if you attempt visionOS; it is nearly undocumented.

        • tikimcfee 2 days ago
          Your implementation sounds great, and I'm looking forward to seeing it. Yes, the SDK is extremely obtuse and really not that easy to play with. I was kinda stubborn and refused to pick up the new rendering tool after having just learned SceneKit and then Metal, so I'm sure it's gotten at least a little easier, but the tools that Metal exposes just don't have analogues from what I can tell.

          Keep me on your short list for beta users when the time comes =)

        • jimmySixDOF 1 day ago
          I thought Unity's PolySpatial was supposed to make this easier, but I'm definitely interested to see what you come up with... AVP is a bit underutilized by apps.

  • dlivingston 2 days ago
    I tried this out on an iPhone 15 Pro.

    Some notes:

    - Opening in portrait mode, the Window Controls title bar is obscured by the Dynamic Island. I cannot move this window without going to landscape orientation.

    - The "Open Folder" button on the Files window doesn't work. File access is never requested.

    - The App Tools window looks funky in landscape mode. The tab bar is only 50% filled, vertically.

    - Windows appear in inconvenient locations, offscreen, partially obscured, or have strange sizing. I found myself rotating between portrait and landscape mode frequently just to do basic UI interactions.

    - Global Search completely obscures other windows, and its title bar is offscreen. This breaks the app. I have to force-close to get it back into a working state.

    You should probably pull iPhone support until this is tested more thoroughly. I imagine similar issues occur on iPad. Not to be harsh, but it's literally unusable on iPhone in its current state.

    • tikimcfee 2 days ago
      Thanks for the input!

      - Yep, it's annoying to have the current window controls overlaid by the safe area. You can still move it around if you're careful enough, but yes, it's not great.

      - The mobile app expects you to download from GitHub directly for now, because yes, there's no direct file import yet. It's not a hard add, but it needs a bit of a different pipe to either copy or access out-of-sandbox files (see the sketch at the end of this comment).

      - Yes, the mobile windowing controls are meant more to give access to all the demos for now than to be a pretty UI. It's not terrible on iPad since you can use the pencil for more accuracy, but I did have a plan to put the tabbed and sidebar controls back in place for mobile at some point.

      - Sorry about the search; if the 'reset' control doesn't do the trick, you might be out of luck for this version. The fix I need to put in place to respect the current screen's safe area is the same one that would keep these windows within the viewport. And, ideally, I'd include an 'IDE' view similar to the desktop's to avoid these multi-window cases that are inconvenient or broken.

      I'll be focusing on a few changes related specifically to mobile UI in the next few builds of this. Touch, for example, has no way to rotate around the y-axis, and there's no way to 'hover' for bookmarking on tap.
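      For the folder-import piece above, here's a minimal sketch of the sandbox-crossing flow on iOS, using SwiftUI's fileImporter plus security-scoped access; the loadRepository callback is a hypothetical placeholder, not the app's actual pipeline.

          import SwiftUI
          import UniformTypeIdentifiers

          // Minimal sandbox-crossing folder import: the picker grants a security-scoped
          // URL that must be explicitly opened before anything inside it can be read.
          struct OpenFolderButton: View {
              @State private var showingImporter = false
              // Hypothetical callback; a real app would copy files or store a
              // bookmark while the security-scoped access is still held.
              let loadRepository: (URL) -> Void

              var body: some View {
                  Button("Open Folder") { showingImporter = true }
                      .fileImporter(isPresented: $showingImporter,
                                    allowedContentTypes: [.folder]) { result in
                          guard case .success(let url) = result,
                                url.startAccessingSecurityScopedResource() else { return }
                          defer { url.stopAccessingSecurityScopedResource() }
                          loadRepository(url)
                      }
              }
          }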

  • visekr 2 days ago
    This is really cool - I've had a half-baked idea of a visual code editor for a while, and this really solidifies my concept. If anyone wants to work together on it, lmk : )

    • tikimcfee 2 days ago
      Glad you think so! =)

      I'd love to chat with ya. Do you see my email? Feel free to email me! If not, reply here with however you'd prefer to chat, and I'll get back to you.

  • d-lisp 2 days ago
    I had the idea last year to create some kind of LSP/UML chimera that would render codebases in 3D so you could navigate them wearing a VR device. You could edit files from there, create files and maybe classes, draw inheritance schemes, meet with other people, explain stuff; that would be a somewhat fun way of welcoming interns.

    • tikimcfee 2 days ago
      This is exactly the use case I'm aiming for! I've done a lot of mentoring, and a big part of getting familiar with a codebase is building a mental map of where things are in relation to what they're doing. In upcoming versions, I'll be restoring the 'state sharing' control that lets multiple devices share their camera orientations and bookmarks between users over a local P2P connection, and maybe eventually in some remote-server-backed way.

  • LoganDark 2 days ago
    This style of visualization reminds me of "VIRUS: The Game": https://www.youtube.com/watch?v=iIgpWGVvfjA&list=PLi_KYBWS_E...

    • tikimcfee 2 days ago
      Ha! You're kinda right and I like it. I guess thinking of literal files floating in space isn't a unique human experience eh? Good to know I've got someone like me out there at least =)

      • LoganDark 2 days ago
        I've even done one better: this but in virtual reality :)

        I also don't identify as human, so I suppose that isn't even necessarily a human experience either~

        • tikimcfee 2 days ago
          Well then I have to say well done, haha. I had an old version in SceneKit / ARKit that worked in AR, but performance was limited by my implementation and I could only get a few hundred files on screen at once before you started losing too many frames.

          I wish you luck in your endeavors, and maybe one day we can chat about your experience, and what you've done =)

          • LoganDark 2 days ago
            Any day! I love learning and sharing :)

            And I didn't make anything that automatically populates files in VR, but I did use a program (XSOverlay) that lets me use 2D desktop windows in VR, and I would open a crap ton of files at once and have them positioned all around me (along with the directory listings they came from).