
Big bug with CROW_BP_CATCHALL_ROUTE loses the request #729

Open
@Robbivan

Description


There is a problem with receiving the request in a catchall route.
According to the library, a lambda with this capture and signature is the normal configuration:

CROW_BP_CATCHALL_ROUTE(m_bp_inner_example)
([&](const crow::request& req, crow::response& res) {
    ...
});

However, the request arrives empty, and a read error ("from read(1)") is logged from http_connection.h:

void do_read()
{
    ...
    if (error_while_reading)
    {
        cancel_deadline_timer();
        parser_.done();
        adaptor_.shutdown_read();
        adaptor_.close();
        is_reading = false;
        CROW_LOG_DEBUG << this << " from read(1) with description: \"" << http_errno_description(static_cast<http_errno>(parser_.http_errno)) << '\"';
        check_destroy();
    }
}

With custom Crow debug logging it looks like this:
(2023-12-08 12:44:22) [DEBUG ] task_timer scheduled: 0x7f88923fe5f0 1
(2023-12-08 12:44:22) [DEBUG ] 0x5582c21ae990 timer added: 0x7f88923fe5f0 1
(2023-12-08 12:44:22) [DEBUG ] clear
(2023-12-08 12:44:22) [DEBUG ] length280
(2023-12-08 12:44:22) [DEBUG ] Cannot match rules /bp_prefix/bp2/wq.
(2023-12-08 12:44:22) [DEBUG ] length0
(2023-12-08 12:44:22) [INFO ] Response: 0x5582c21ae990 /bp_prefix/bp2/wq 200 0
(2023-12-08 12:44:22) [DEBUG ] 0x5582c21ae990 timer cancelled: 0x7f88923fe5f0 1
(2023-12-08 12:44:22) [DEBUG ] task_timer cancelled: 0x7f88923fe5f0 1
(2023-12-08 12:44:22) [DEBUG ] length0
(2023-12-08 12:44:22) [DEBUG ] 0x5582c21ae990 from read(1) with description: "stream ended at an unexpected time"
(2023-12-08 12:44:22) [DEBUG ] clear

As I understand it, the request initially arrives intact, with a length greater than 0. However, once no matching BP_ROUTE is found, the buffer holding the request is cleared in parser.h:

from parser.h

void clear()
{
    url.clear();
    raw_url.clear();
    header_field.clear();
    header_value.clear();
    headers.clear();
    url_params.clear();
    body.clear();
    header_building_state = 0;
    qs_point = 0;
    http_major = 0;
    http_minor = 0;
    message_complete = false;
    state = CROW_NEW_MESSAGE();
    keep_alive = false;
    close_connection = false;
}

Only the url is saved and passed on to BP_CATCHALL. Because the buffer is cleared before dispatching to BP_CATCHALL, we never receive the original request; a new, empty one is created instead.

Please fix this bug! We are really looking forward to it; this will be great functionality.
