A question about an error when running the official LightRAG example

mdb · 18 hours 16 minutes ago · 311 views
    LightRAG is a RAG framework, and I want to use it for AI knowledge-base Q&A. After downloading the code from GitHub and finally getting the environment set up after a lot of effort, it throws an error right when it seems about to succeed. Can anyone who knows this framework see what the cause is?
    The local Ollama models are qwen2.5:7b and the embedding model nomic-embed-text. The following error is raised during the insert operation.
    Key error: IndexError: index 0 is out of bounds for axis 0 with size 0

    ------------- Original log -------------
    INFO:httpx:HTTP Request: POST http://localhost:11434/api/embeddings "HTTP/1.1 200 OK"
    Generating embeddings: 100%|██████████| 2/2 [00:22<00:00, 11.12s/batch]
    INFO:lightrag:Writing graph with 0 nodes, 0 edges
    Traceback (most recent call last):
      File "D:\download\ff\LightRAG-main\examples\lightrag_ollama_demo.py", line 31, in <module>
        rag.insert(f.read())
      File "D:\download\ff\LightRAG-main\lightrag\lightrag.py", line 238, in insert
        return loop.run_until_complete(self.ainsert(string_or_strings))
      File "E:\Python310\lib\asyncio\base_events.py", line 649, in run_until_complete
        return future.result()
      File "D:\download\ff\LightRAG-main\lightrag\lightrag.py", line 286, in ainsert
        await self.chunks_vdb.upsert(inserting_chunks)
      File "D:\download\ff\LightRAG-main\lightrag\storage.py", line 112, in upsert
        results = self._client.upsert(datas=list_data)
      File "D:\download\ff\LightRAG-main\venv\lib\site-packages\nano_vectordb\dbs.py", line 100, in upsert
        self.__storage["matrix"][i] = update_d[f_VECTOR].astype(Float)
    IndexError: index 0 is out of bounds for axis 0 with size 0

    Process finished with exit code 1

    ------------- Configuration in the code -------------
    rag = LightRAG(
        working_dir=WORKING_DIR,
        llm_model_func=ollama_model_complete,
        llm_model_name="qwen2.5:7b",
        llm_model_max_async=4,
        llm_model_max_token_size=32768,
        llm_model_kwargs={"host": "http://localhost:11434", "options": {"num_ctx": 32768}},
        embedding_func=EmbeddingFunc(
            embedding_dim=768,
            max_token_size=8192,
            func=lambda texts: ollama_embedding(
                texts, embed_model="nomic-embed-text", host="http://localhost:11434"
            ),
        ),
    )

    with open("./book.txt", "r", encoding="utf-8") as f:
        rag.insert(f.read())  # <-------------------- this line raises the error
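    For completeness, here is a minimal standalone check (a sketch only; it assumes the requests library, an arbitrary test prompt, and the standard Ollama /api/embeddings HTTP endpoint that already appears in the log above) to confirm that nomic-embed-text returns non-empty 768-dimensional vectors from the same host:

    import requests

    # Sketch: same host/port and embedding model name as in the config above.
    resp = requests.post(
        "http://localhost:11434/api/embeddings",
        json={"model": "nomic-embed-text", "prompt": "hello world"},
    )
    resp.raise_for_status()
    embedding = resp.json().get("embedding", [])

    # The config declares embedding_dim=768, so a healthy response should be
    # a non-empty list of 768 floats.
    print(len(embedding))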


    What is causing this? Is something configured wrong, or is there a problem with the code?
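    In case it is relevant, here is a small sketch (nothing LightRAG-specific is assumed; WORKING_DIR stands for the same directory passed to LightRAG above, shown with a hypothetical value) that lists what ends up in the working directory after the failed insert, since the log reports writing a graph with 0 nodes and 0 edges:

    from pathlib import Path

    WORKING_DIR = "./lightrag_workdir"  # hypothetical; substitute the real working_dir

    # Print every file in the working directory and its size, to see whether
    # anything was actually persisted before the crash.
    for p in sorted(Path(WORKING_DIR).iterdir()):
        print(f"{p.name}: {p.stat().st_size} bytes")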
    1 reply
    Xs0ul · 15 hours 41 minutes ago