Thefork Fast Scraper Per Result

Under maintenance

Pricing: $5.00 / 1,000 results

Developed by

Maksym Bohomolov

Maintained by Community

Scrape TheFork.com quickly and easily! Skip bloated browser tools. This scraper extracts restaurant data in a flash; no heavy lifting needed. Scrape and monitor data with ease, all without Puppeteer or Playwright. ⚡️
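
For reference, a minimal usage sketch, assuming the Actor is started through the standard Apify Python client. The actor identifier below is a placeholder, and the run input simply mirrors the fields shown in the issue report further down this page, not the Actor's full input schema.

import os

from apify_client import ApifyClient

# Authenticate with your Apify API token (read from the environment here).
client = ApifyClient(os.environ["APIFY_TOKEN"])

# Run input mirroring the issue report below.
run_input = {
    "urls": ["https://www.thefork.it/restaurant/cantina-di-biffi-r484917"],
    "getReview": True,
    "getReservation": False,
    "reviewLimit": 100,
    "searchLimit": 1,
    "partySize": 1,
    "proxySettings": {"useApifyProxy": True, "apifyProxyGroups": ["RESIDENTIAL"]},
}

# "<username>/thefork-fast-scraper-per-result" is a placeholder;
# use the actor ID shown on the Store page.
run = client.actor("<username>/thefork-fast-scraper-per-result").call(run_input=run_input)

# Results are typically pushed to the run's default dataset
# (the log below also shows a separate named dataset for reviews).
for item in client.dataset(run["defaultDatasetId"]).iterate_items():
    print(item)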

Rating: 5.0 (1)

Total users: 31
Monthly users: 6
Runs succeeded: >99%
Issues response: 2 hours
Last modified: 2 days ago

Request to https://api.thefork.com/graphql failed and reached maximum retries

Open

Rebyu opened this issue 2 days ago

Input:

{
  "getReservation": false,
  "getReview": true,
  "proxySettings": {
    "useApifyProxy": true,
    "apifyProxyGroups": ["RESIDENTIAL"]
  },
  "reviewLimit": 100,
  "searchLimit": 1,
  "stopAtDate": "2025-07-27",
  "urls": ["https://www.thefork.it/restaurant/cantina-di-biffi-r484917"],
  "partySize": 1
}

Error log:

2025-08-01T10:52:08.945Z ACTOR: Pulling Docker image of build FmMkokZJePhPIeKgG from registry.
2025-08-01T10:52:10.697Z ACTOR: Creating Docker container.
2025-08-01T10:52:11.805Z ACTOR: Starting Docker container.
2025-08-01T10:52:15.842Z [apify._configuration] WARN Actor is running on the Apify platform, disable_browser_sandbox was changed to True.
2025-08-01T10:52:16.224Z [apify] INFO Initializing Actor...
2025-08-01T10:52:16.227Z [apify] INFO System info ({"apify_sdk_version": "2.6.0", "apify_client_version": "1.9.2", "crawlee_version": "0.6.11", "python_version": "3.13.5", "os": "linux"})
2025-08-01T10:52:16.555Z [apify] INFO Creating dataset for reviews...
2025-08-01T10:52:16.596Z [apify] INFO Dataset for reviews created: VwIgycrShGvOLKSoh with name thefork-reviews-2025-08-01-10-52-16
2025-08-01T10:52:16.601Z [HttpCrawler] INFO Start scraping for 1 searches
2025-08-01T10:52:16.607Z [HttpCrawler] INFO Current request statistics:
2025-08-01T10:52:16.608Z ┌───────────────────────────────┬──────────┐
2025-08-01T10:52:16.609Z │ requests_finished             │ 0        │
2025-08-01T10:52:16.610Z │ requests_failed               │ 0        │
2025-08-01T10:52:16.611Z │ retry_histogram               │ [0]      │
2025-08-01T10:52:16.611Z │ request_avg_failed_duration   │ None     │
2025-08-01T10:52:16.612Z │ request_avg_finished_duration │ None     │
2025-08-01T10:52:16.613Z │ requests_finished_per_minute  │ 0        │
2025-08-01T10:52:16.613Z │ requests_failed_per_minute    │ 0        │
2025-08-01T10:52:16.614Z │ request_total_duration        │ 0.0      │
2025-08-01T10:52:16.615Z │ requests_total                │ 0        │
2025-08-01T10:52:16.616Z │ crawler_runtime               │ 0.001346 │
2025-08-01T10:52:16.616Z └───────────────────────────────┴──────────┘
2025-08-01T10:52:16.617Z [crawlee._autoscaling.autoscaled_pool] INFO current_concurrency = 0; desired_concurrency = 2; cpu = 0; mem = 0; event_loop = 0.0; client_info = 0.0
2025-08-01T10:52:18.586Z [HttpCrawler] WARN Encountered a session error, rotating session and retrying
2025-08-01T10:52:20.079Z [HttpCrawler] WARN Encountered a session error, rotating session and retrying
2025-08-01T10:52:21.637Z [HttpCrawler] WARN Encountered a session error, rotating session and retrying
2025-08-01T10:52:23.083Z [HttpCrawler] WARN Encountered a session error, rotating session and retrying
2025-08-01T10:52:25.464Z [HttpCrawler] WARN Encountered a session error, rotating session and retrying
2025-08-01T10:52:27.292Z [HttpCrawler] WARN Encountered a session error, rotating session and retrying
2025-08-01T10:52:28.946Z [HttpCrawler] WARN Encountered a session error, rotating session and retrying
2025-08-01T10:52:31.512Z [HttpCrawler] WARN Encountered a session error, rotating session and retrying
2025-08-01T10:52:32.690Z [HttpCrawler] WARN Encountered a session error, rotating session and retrying
2025-08-01T10:52:34.643Z [HttpCrawler] WARN Encountered a session error, rotating session and retrying
2025-08-01T10:52:36.619Z [HttpCrawler] WARN Encountered a session error, rotating session and retrying
2025-08-01T10:52:38.209Z [HttpCrawler] WARN Encountered a session error, rotating session and retrying
2025-08-01T10:52:41.131Z [HttpCrawler] WARN Encountered a session error, rotating session and retrying
2025-08-01T10:52:43.169Z [HttpCrawler] WARN Encountered a session error, rotating session and retrying
2025-08-01T10:52:45.415Z [HttpCrawler] ERROR Request to https://api.thefork.com/graphql failed and reached maximum retries
2025-08-01T10:52:45.416Z Traceback (most recent call last):
2025-08-01T10:52:45.417Z   File "/app/.venv/lib/python3.13/site-packages/crawlee/crawlers/_basic/_basic_crawler.py", line 1328, in __run_task_function
2025-08-01T10:52:45.417Z     await self._run_request_handler(context=context)
2025-08-01T10:52:45.418Z   File "/app/.venv/lib/python3.13/site-packages/crawlee/crawlers/_basic/_basic_crawler.py", line 1422, in _run_request_handler
2025-08-01T10:52:45.419Z     await wait_for(
2025-08-01T10:52:45.420Z     ...<5 lines>...
2025-08-01T10:52:45.420Z     )
2025-08-01T10:52:45.421Z   File "/app/.venv/lib/python3.13/site-packages/crawlee/_utils/wait.py", line 37, in wait_for
2025-08-01T10:52:45.422Z     return await asyncio.wait_for(operation(), timeout.total_seconds())
2025-08-01T10:52:45.423Z            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
2025-08-01T10:52:45.424Z   File "/usr/local/lib/python3.13/asyncio/tasks.py", line 507, in wait_for
2025-08-01T10:52:45.425Z     return await fut
2025-08-01T10:52:45.426Z            ^^^^^^^^^
2025-08-01T10:52:45.427Z   File "/app/.venv/lib/python3.13/site-packages/crawlee/crawlers/_basic/_context_pipeline.py", line 68, in __call__
2025-08-01T10:52:45.427Z     result = await middleware_instance.__anext__()
2025-08-01T10:52:45.428Z              ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
2025-08-01T10:52:45.429Z   File "/app/.venv/lib/python3.13/site-packages/crawlee/crawlers/_abstract_http/_abstract_http_crawler.py", line 242, in _handle_status_code_response
2025-08-01T10:52:45.429Z     self._raise_for_session_blocked_status_code(context.session, status_code)
2025-08-01T10:52:45.430Z     ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
2025-08-01T10:52:45.431Z   File "/app/.venv/lib/python3.13/site-packages/crawlee/crawlers/_basic/_basic_crawler.py", line 1466, in _raise_for_session_blocked_status_code
2025-08-01T10:52:45.432Z     raise SessionError(f'Assuming the session is blocked based on HTTP status code {status_code}')
2025-08-01T10:52:45.432Z crawlee.errors.SessionError: Assuming the session is blocked based on HTTP status code 403
2025-08-01T10:52:45.433Z [crawlee._autoscaling.autoscaled_pool] INFO Waiting for remaining tasks to finish
2025-08-01T10:52:45.523Z [HttpCrawler] INFO Error analysis: total_errors=1 unique_errors=1
2025-08-01T10:52:45.524Z [HttpCrawler] INFO Final request statistics:
2025-08-01T10:52:45.525Z ┌────────────────────────────┬────────────────────────────┐
2025-08-01T10:52:45.526Z │ requests_finished          │ 0                          │
2025-08-01T10:52:45.527Z │ requests_failed            │ 1                          │
2025-08-01T10:52:45.527Z │ retry_histogram            │ [0, 0, 0, 0, 0, 0, 0, 0... │
2025-08-01T10:52:45.528Z │ request_avg_failed_dura... │ 2.244373                   │
2025-08-01T10:52:45.529Z │ request_avg_finished_du... │ None                       │
2025-08-01T10:52:45.530Z │ requests_finished_per_m... │ 0                          │
2025-08-01T10:52:45.530Z │ requests_failed_per_minute │ 2                          │
2025-08-01T10:52:45.531Z │ request_total_duration     │ 2.244373                   │
2025-08-01T10:52:45.532Z │ requests_total             │ 1                          │
2025-08-01T10:52:45.533Z │ crawler_runtime            │ 28.921218                  │
2025-08-01T10:52:45.533Z └────────────────────────────┴────────────────────────────┘
2025-08-01T10:52:45.534Z [apify] INFO Scraper has been successfully completed.
2025-08-01T10:52:45.535Z [apify] INFO Exiting Actor ({"exit_code": 0})

mantisus replied:

Yes, I see. Thank you.

The site has been updated, so I am switching the Actor to maintenance mode.