
  • Tar_Alcaran@sh.itjust.works
    1 year ago

    The demo looks pretty impressive, but it’s a prerecorded demo we know nothing about. So many AI companies have been lying about their benchmarks.

  • Karkitoo@lemmy.ml
    1 year ago

    Looks impressive, and it’s truly open-source.

    However, I see it requires CUDA. Could it still run:

    1. Without this?
    2. With AMD hardware?
    3. On mobile (since the model is only 1B)?
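    For what it’s worth, a minimal device-fallback sketch for point 2: ROCm builds of PyTorch expose AMD GPUs through the same `torch.cuda` API, so if the repo’s code is plain PyTorch (an assumption, not confirmed from the repo), it may run unchanged. The `Model()` line is hypothetical.

    ```python
    import torch

    # Pick the best available device. ROCm builds of PyTorch report AMD
    # GPUs through the same torch.cuda API, so no code change is needed there.
    if torch.cuda.is_available():
        device = torch.device("cuda")
    elif getattr(torch.backends, "mps", None) is not None and torch.backends.mps.is_available():
        device = torch.device("mps")  # Apple-silicon fallback
    else:
        device = torch.device("cpu")

    # model = Model().to(device)  # hypothetical: move whatever model the repo loads
    print(device)
    ```

    On a machine with no GPU this falls through to `cpu`, which answers point 1 only if the repo avoids hard-coded `.cuda()` calls.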
    • thickertoofan@lemm.eeOP
      1 year ago

      I think the bigger bottleneck is SLAM: running it is intensive, and it won’t run directly on video. SLAM is tough, I guess, and reading the repo gives no clues that it can run with CPU inference.