Cron Job is not working in 4.1.4

Version: 4.1.4
Website URL: https://www.sharemyimage.com/
PHP version: 8.1.28
Database driver: MySQL
Database version: 8.0.36
Web browser: Chrome

sharemyimage

Chevereto Member
Cron job is not working in 4.1.4. When I try to manually run it from the dashboard, it says:

Request denied

You either don't have permission to access this page or the link has expired.

Is anyone else encountering a similar issue?

What could be the SQL query to clear the queues and remove the images?
 
I updated directly from 4.1.1 to 4.1.4; could this be the main reason the cron job is not working?

Do let me know if there is any solution to this.

What could be the SQL query to clear the queues and remove the images? Or can I delete them directly from the queues table?
 
After logging in to your server as root, type crontab -e and paste the code below at the end, then save it. After that it should work.

Code:
# The "www-data" user field applies to the system crontab (/etc/crontab or
# /etc/cron.d); drop it if you add the line via a per-user crontab -e.
* * * * * www-data php /home/USERNAME/public_html/app/bin/legacy -C cron

Match these values to those on your server:
  • www-data
  • /home/USERNAME/public_html
Important note: on systems such as Ubuntu, each user has their own permissions, so you need to log in as the site's user (su username), not as root, and set the crontab there, as in the sketch below.
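For example, a minimal sketch of doing that, assuming the site runs under a user called username with the install at /home/username/public_html (both hypothetical, match them to your server):

Code:
# switch to the account that owns the Chevereto files
su - username
# edit that user's crontab; no user field is used in this format
crontab -e
# then add this single line at the end and save:
* * * * * php /home/username/public_html/app/bin/legacy -C cron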
 
The HTTP-triggered cron job requires the auth token; this is to prevent CSRF attacks. If you get a Request denied error, it is because the auth token has expired or the URL doesn't include the auth_token query string.

You can't save the HTTP run-cron URL, because it needs the auth token, which changes every time you log in.
 
I am currently using version 4.1.4, and the cron job was functioning properly. We deleted approximately 100,000 images, but when I checked the queues table, it showed there were still 300,000 images (including thumbnails) pending deletion. However, today I noticed that the cron job seems to be stuck, indicating that there are still 132,000 images left to process.

Please note that only the manual cron job is encountering errors, while the automatic one is functioning properly.

Currently, I am manually executing the cron job, but it is returning an error: Request denied. Please see the attached screenshot.

What could be the problem?

[Screenshot: Request denied error]

This is the queues table, which is stuck at 132,836:

[Screenshot: queues table]
 
I am manually executing the cron job, but it is returning an error: Request denied. Please see the attached screenshot.
Did you bookmark the "Run CRON" link? That link can't be saved; it requires the session access token to process the request. The manual cron is a last resort, you shouldn't rely on it in production.

indicating that there are still 132,000 images left to process.
Chevereto tries a few times and then stops trying; you will need to update the queues table and set queue_attempts to 0.
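A minimal sketch of that from the shell, assuming the default chv_ table prefix and hypothetical database credentials (back up the database before touching it):

Code:
# Reset the retry counter so pending deletions are picked up on the next run.
# chv_queues assumes the default table prefix; replace user and database with your own.
mysql -u chevereto_user -p chevereto_db \
  -e "UPDATE chv_queues SET queue_attempts = 0 WHERE queue_attempts > 0;"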
 
Did you bookmark the "Run CRON" link? That link can't be saved; it requires the session access token to process the request. The manual cron is a last resort, you shouldn't rely on it in production.


Chevereto tries a few times and then stops trying; you will need to update the queues table and set queue_attempts to 0.
No, I did not bookmark the link; I just clicked the link in the dashboard.

If I delete the queues, will that also delete the images, or will they remain in the bucket?
 

The Chevereto website is currently inaccessible in India.
I'm sorry but India is blocked in some endpoints due to high spam. It shouldn't affect the API.

The Request denied error seems session-token related; when the cron fails it throws a different error.

You need to run the cron directly in the CLI to try to understand where it went wrong.
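A minimal sketch, reusing the install path from the crontab example above and assuming the web server user is www-data (adjust both to your installation):

Code:
# Run the cron once as the web server user and read the output for the real error.
sudo -u www-data php /home/USERNAME/public_html/app/bin/legacy -C cron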
 