
pipe.fit() freezes and don't return anything #91

Open
prakhar251998 opened this issue Aug 7, 2023 · 13 comments

Comments

@prakhar251998 commented Aug 7, 2023

Hi, many times while running this model in PyCharm the model just freezes and doesn't return the output.
It sometimes works after restarting the kernel two or three times.
Sometimes it doesn't work at all.
Please look into this.
[screenshot attached]

@prakhar251998 prakhar251998 changed the title pipe.fit() freezes and don't return anaything pipe.fit() freezes and don't return anything Aug 8, 2023
@prakhar251998 (Author)

On further investigation, it seems initializing the model pipeline can happen only once per kernel, as it opens the llm_response session.

If we want to run this API again with a different key or a different config (domain and labels), we need to restart the kernel.

@monk1337 (Contributor)

Yes, because changing the key, domain, etc. would initialize the prompter, model, etc. again. For a single model, you can initialize things once and then call pipe.fit() multiple times on any number of samples.
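To illustrate that lifecycle with a stand-in class (the real Promptify Pipeline needs a live API key, so this only sketches the pattern, not the library's code): expensive setup happens once at construction, and fit() is then cheap to call repeatedly.

```python
class FakePipeline:
    """Stand-in for Promptify's Pipeline: __init__ models the one-time
    setup (prompter, model, session); fit() models repeated use."""

    def __init__(self, api_key: str, domain: str):
        self.api_key = api_key   # changing the key means building a new instance
        self.domain = domain     # likewise for the domain/labels config
        self.calls = 0

    def fit(self, sample: str) -> str:
        self.calls += 1
        return f"{self.domain}:{sample}"

pipe = FakePipeline(api_key="sk-...", domain="medical")        # initialize once
results = [pipe.fit(s) for s in ["sample one", "sample two"]]  # fit() many times
```

Swapping keys or configs mid-session is what requires rebuilding the pipeline; repeated fit() calls on the same instance do not.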

@monk1337 (Contributor)

> Hi, many times while running this model in PyCharm the model just freezes and doesn't return the output. It sometimes works after restarting the kernel two or three times. Sometimes it doesn't work at all. Please look into this. [screenshot attached]

Can you provide more details on this? Your sentence, template, etc., and a minimal code example to reproduce the error.

@kansalaman

Was facing the same issue, setting structured_output=False and parsing the output myself seems to bypass the issue.
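A minimal sketch of that workaround, assuming the raw completion contains a single JSON object somewhere in surrounding prose (parse_model_output is a hypothetical helper, not part of Promptify):

```python
import json
import re

def parse_model_output(raw_text: str):
    """Extract the first {...} span from a raw completion and decode it.
    Greedy match from the first '{' to the last '}', so it assumes
    exactly one JSON object in the text."""
    match = re.search(r"\{.*\}", raw_text, re.DOTALL)
    if match is None:
        raise ValueError("no JSON object found in model output")
    return json.loads(match.group(0))

# Typical raw output with chatter around the JSON payload:
print(parse_model_output('Sure! {"E": "73", "T": "Age"} Hope that helps.'))
```

This sidesteps the structured-output path entirely; with structured_output=False the model's raw text comes back and you decode it on your own terms.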

@roperi

roperi commented Apr 7, 2024

> Hi, many times while running this model in PyCharm the model just freezes and doesn't return the output. It sometimes works after restarting the kernel two or three times. Sometimes it doesn't work at all. Please look into this. [screenshot attached]

> Can you provide more details on this? Your sentence, template, etc., and a minimal code example to reproduce the error.

Facing the same issue. Adding structured_output=False in pipe.fit() as @kansalaman suggested unfreezes the frozen state, but it then fails with:

ERROR:    Exception in ASGI application
Traceback (most recent call last):
  File "/home/username/.virtualenvs/promptify/lib/python3.11/site-packages/uvicorn/protocols/http/h11_impl.py", line 407, in run_asgi
    result = await app(  # type: ignore[func-returns-value]
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/username/.virtualenvs/promptify/lib/python3.11/site-packages/uvicorn/middleware/proxy_headers.py", line 69, in __call__
    return await self.app(scope, receive, send)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/username/.virtualenvs/promptify/lib/python3.11/site-packages/uvicorn/middleware/message_logger.py", line 84, in __call__
    raise exc from None
  File "/home/username/.virtualenvs/promptify/lib/python3.11/site-packages/uvicorn/middleware/message_logger.py", line 80, in __call__
    await self.app(scope, inner_receive, inner_send)
  File "/home/username/.virtualenvs/promptify/lib/python3.11/site-packages/fastapi/applications.py", line 1054, in __call__
    await super().__call__(scope, receive, send)
  File "/home/username/.virtualenvs/promptify/lib/python3.11/site-packages/starlette/applications.py", line 123, in __call__
    await self.middleware_stack(scope, receive, send)
  File "/home/username/.virtualenvs/promptify/lib/python3.11/site-packages/starlette/middleware/errors.py", line 186, in __call__
    raise exc
  File "/home/username/.virtualenvs/promptify/lib/python3.11/site-packages/starlette/middleware/errors.py", line 164, in __call__
    await self.app(scope, receive, _send)
  File "/home/username/.virtualenvs/promptify/lib/python3.11/site-packages/starlette/middleware/exceptions.py", line 65, in __call__
    await wrap_app_handling_exceptions(self.app, conn)(scope, receive, send)
  File "/home/username/.virtualenvs/promptify/lib/python3.11/site-packages/starlette/_exception_handler.py", line 64, in wrapped_app
    raise exc
  File "/home/username/.virtualenvs/promptify/lib/python3.11/site-packages/starlette/_exception_handler.py", line 53, in wrapped_app
    await app(scope, receive, sender)
  File "/home/username/.virtualenvs/promptify/lib/python3.11/site-packages/starlette/routing.py", line 756, in __call__
    await self.middleware_stack(scope, receive, send)
  File "/home/username/.virtualenvs/promptify/lib/python3.11/site-packages/starlette/routing.py", line 776, in app
    await route.handle(scope, receive, send)
  File "/home/username/.virtualenvs/promptify/lib/python3.11/site-packages/starlette/routing.py", line 297, in handle
    await self.app(scope, receive, send)
  File "/home/username/.virtualenvs/promptify/lib/python3.11/site-packages/starlette/routing.py", line 77, in app
    await wrap_app_handling_exceptions(app, request)(scope, receive, send)
  File "/home/username/.virtualenvs/promptify/lib/python3.11/site-packages/starlette/_exception_handler.py", line 64, in wrapped_app
    raise exc
  File "/home/username/.virtualenvs/promptify/lib/python3.11/site-packages/starlette/_exception_handler.py", line 53, in wrapped_app
    await app(scope, receive, sender)
  File "/home/username/.virtualenvs/promptify/lib/python3.11/site-packages/starlette/routing.py", line 72, in app
    response = await func(request)
               ^^^^^^^^^^^^^^^^^^^
  File "/home/username/.virtualenvs/promptify/lib/python3.11/site-packages/fastapi/routing.py", line 296, in app
    content = await serialize_response(
              ^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/username/.virtualenvs/promptify/lib/python3.11/site-packages/fastapi/routing.py", line 180, in serialize_response
    return jsonable_encoder(response_content)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/username/.virtualenvs/promptify/lib/python3.11/site-packages/fastapi/encoders.py", line 301, in jsonable_encoder
    jsonable_encoder(
  File "/home/username/.virtualenvs/promptify/lib/python3.11/site-packages/fastapi/encoders.py", line 287, in jsonable_encoder
    encoded_value = jsonable_encoder(
                    ^^^^^^^^^^^^^^^^^
  File "/home/username/.virtualenvs/promptify/lib/python3.11/site-packages/fastapi/encoders.py", line 287, in jsonable_encoder
    encoded_value = jsonable_encoder(
                    ^^^^^^^^^^^^^^^^^
  File "/home/username/.virtualenvs/promptify/lib/python3.11/site-packages/fastapi/encoders.py", line 331, in jsonable_encoder
    return jsonable_encoder(
           ^^^^^^^^^^^^^^^^^
  File "/home/username/.virtualenvs/promptify/lib/python3.11/site-packages/fastapi/encoders.py", line 331, in jsonable_encoder
    return jsonable_encoder(
           ^^^^^^^^^^^^^^^^^
  File "/home/username/.virtualenvs/promptify/lib/python3.11/site-packages/fastapi/encoders.py", line 287, in jsonable_encoder
    encoded_value = jsonable_encoder(
                    ^^^^^^^^^^^^^^^^^
  File "/home/username/.virtualenvs/promptify/lib/python3.11/site-packages/fastapi/encoders.py", line 330, in jsonable_encoder
    raise ValueError(errors) from e
ValueError: [TypeError("'builtin_function_or_method' object is not iterable"), TypeError('vars() argument must have __dict__ attribute')]
TRACE:    127.0.0.1:35808 - HTTP connection lost

@roperi

roperi commented Apr 7, 2024

In my case I'm trying to run the pipeline via FastAPI, but it freezes as reported above. The same thing happens when trying with curl.

@roperi

roperi commented Apr 7, 2024

Upon further inspection I found that it stalls here:

   # promptify/pipelines/__init__.py

    [...]

    def _get_output_from_cache_or_model(self, template):
        output = None

        if self.cache_prompt:
            output = self.prompt_cache.get(template)
        if output is None:
            try:
                response = self.model.execute_with_retry(prompt=template)
            except Exception as e:
                print(f"Error in model execution: {e}")
                return None
            if self.structured_output:
                output = self.model.model_output(
                    response, json_depth_limit=self.json_depth_limit
                )
    [...]

Specifically, in the following part:

output = self.model.model_output(
    response, json_depth_limit=self.json_depth_limit
)

It just stalls there.

@roperi

roperi commented Apr 8, 2024

Found the culprit in promptify/parser/parser.py.

This function goes into an infinite loop in the try/except block. I only noticed it because I printed the exception (the original code doesn't print it):

    def get_possible_completions(
        self, json_str: str, json_depth_limit: int = 5
    ) -> Union[Dict[str, Any], List[Any]]:
        """
        Returns a list of possible completions for a JSON object string.

        Parameters
        ----------
        json_str : str
            The JSON object string
        json_depth_limit : int, optional
            The maximum length of the completion strings to try (default is 5)

        Returns
        -------
        Union[Dict[str, Any], List[Any]]
            If the completion strings are objects, returns a dictionary with 'completion' and 'suggestions' keys.
            If the completion strings are arrays, returns a list of suggested completions.
        """
        candidate_marks = ["}", "]"]
        if "[" not in json_str:
            candidate_marks.remove("]")
        if "{" not in json_str:
            candidate_marks.remove("}")

        # specify the mark should end with
        should_end_mark = "]" if json_str.strip()[0] == "[" else "}"
        completions = []
        for completion_str in self.get_combinations(
            candidate_marks, json_depth_limit, should_end_mark=should_end_mark
        ):
            try:
                completed_obj = self.complete_json_object(json_str, completion_str)
                completions.append(completed_obj)
            except Exception as e:
                print(e)
                pass
        return self.find_max_length(completions)

Results in:

Couldn't fix JSON
Couldn't fix JSON
Couldn't fix JSON
[... the same line repeats indefinitely ...]
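The behaviour is easy to reproduce in isolation. Below is a simplified sketch of the brute-force completion strategy, not the library's exact code: try every combination of closing marks up to json_depth_limit. When the truncation point is unfixable, e.g. inside an unterminated string, every candidate fails, so the loop grinds through len(marks)**depth candidates at each depth before giving up, which looks like a hang for large limits.

```python
import itertools
import json

def complete_truncated_json(json_str: str, json_depth_limit: int = 5):
    """Brute-force sketch: append every combination of '}' / ']' (up to
    json_depth_limit marks) until one parses. Exponential in the limit."""
    marks = [m for m, opener in (("}", "{"), ("]", "[")) if opener in json_str]
    for depth in range(1, json_depth_limit + 1):
        for combo in itertools.product(marks, repeat=depth):
            try:
                return json.loads(json_str + "".join(combo))
            except json.JSONDecodeError:
                continue  # this closing sequence didn't produce valid JSON
    return None  # unfixable within the depth limit

print(complete_truncated_json('{"a": [1, 2'))      # recoverable truncation
print(complete_truncated_json('{"key": "unterm'))  # unfixable: the string never closes
```

The second call exhausts all 2 + 4 + 8 + 16 + 32 candidates before returning None; with a larger depth limit or more candidate marks, that exhaustion is what presents as a freeze.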

@roperi

roperi commented Apr 8, 2024

Funny thing: there are no problems when one runs it in Python. The problem happens as soon as you try it via curl, FastAPI, or Flask. It just enters this infinite loop.

@roperi

roperi commented Apr 8, 2024

> Funny thing: there are no problems when one runs it in Python. The problem happens as soon as you try it via curl, FastAPI, or Flask. It just enters this infinite loop.

It just started failing when run from the Python shell too. The parser is completely unreliable when it comes to parsing the JSON. I'd recommend updating the code to use Pydantic with Instructor.
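The idea behind that recommendation is to validate the decoded output against an explicit schema and fail fast, rather than looping in a repair heuristic. Sketched here with stdlib dataclasses to stay dependency-free (Pydantic/Instructor would add retries and stricter coercion; the Entity class and its E/T fields are illustrative, not Promptify's actual schema):

```python
import json
from dataclasses import dataclass

@dataclass
class Entity:
    E: str  # entity text
    T: str  # entity type

def parse_entities(raw_json: str) -> list:
    """Decode and validate in one step; unexpected shapes raise immediately."""
    data = json.loads(raw_json)
    if not isinstance(data, list):
        raise TypeError("expected a JSON array of entities")
    return [Entity(**item) for item in data]  # missing/extra keys -> TypeError

entities = parse_entities('[{"E": "73", "T": "Age"}, {"E": "diabetes", "T": "Disease"}]')
print(entities[0].T)
```

A hard failure surfaces the malformed output to the caller (e.g. as a 502 from the API layer) instead of hanging the request.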

@Fan4ik20

Fan4ik20 commented Apr 23, 2024

@roperi Hi, did you solve this problem?

Neither structured_output=False nor the global Pipeline works for me.

Update: it gets stuck both in plain Python and in FastAPI.

Update 2: Fixed. structured_output is a Pipeline attribute :)

@roperi

roperi commented Apr 27, 2024

> @roperi Hi, did you solve this problem?
>
> Neither structured_output=False nor the global Pipeline works for me.
>
> Update: it gets stuck both in plain Python and in FastAPI.
>
> Update 2: Fixed. structured_output is a Pipeline attribute :)

@Fan4ik20 Unfortunately I didn't. The parser kept failing to parse the JSON. I decided not to use Promptify and moved on.

@Fan4ik20

@roperi Yes, same for me, so I parsed the received JSON myself. But for some reason we received JSON with extra brackets, so I had to remove them.
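One way to handle those stray brackets (a hypothetical helper, not from Promptify): track brace depth and stop at the first balanced object, so anything the model tacked on after it is ignored. Note it does not account for braces inside string values.

```python
import json

def extract_balanced_json(text: str):
    """Return the first balanced {...} block decoded as JSON, ignoring
    extra trailing brackets. Does not handle '{'/'}' inside strings."""
    start = text.find("{")
    if start == -1:
        return None
    depth = 0
    for i, ch in enumerate(text[start:], start):
        if ch == "{":
            depth += 1
        elif ch == "}":
            depth -= 1
            if depth == 0:
                return json.loads(text[start : i + 1])
    return None  # opening brace was never balanced

print(extract_balanced_json('{"E": "73", "T": "Age"}}}'))  # extra brackets dropped
```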

5 participants