Bug #18203 (open)
Missing report with directive Scheduled Job
Description
I have a general setup with an agent run period of 6 hours and a 5-hour max delay, and a Scheduled Job directive configured like this (for all my 20 nodes):
Lowest time the command should be run at: 1
Highest time the command should be run at: 4
Consider the job failed after (minutes): 120
Return codes considered as a success: 0
Return codes considered as a repair: 1
Return codes considered as an error: 2
Regularly, some nodes (not the same ones; most of the time only one, but it can be more) have the compliance message 'missing report'.
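As a side note on how such a window can be missed: below is a minimal simulation sketch, assuming (as a simplification, not the technique's actual logic) that the job only fires when an agent run lands between the lowest and highest hours. With a 6-hour period and a 5-hour max delay, a noticeable fraction of days then has no agent run inside the 01:00-04:00 window.

import random

def run_hits_window(first_run_h, period_h, low_h, high_h, horizon_h=24):
    """True if at least one agent run within the horizon lands inside [low_h, high_h]."""
    t = first_run_h
    while t < horizon_h:
        if low_h <= (t % 24) <= high_h:
            return True
        t += period_h
    return False

def estimated_miss_rate(period_h, splay_h, low_h, high_h, trials=100_000):
    """Fraction of simulated days where no agent run falls inside the job window."""
    misses = sum(
        not run_hits_window(random.uniform(0.0, splay_h), period_h, low_h, high_h)
        for _ in range(trials)
    )
    return misses / trials

if __name__ == "__main__":
    # values from this report: 6 h agent period, 5 h max delay, window 01:00-04:00
    print(estimated_miss_rate(6, 5, 1, 4))

Under that simplified assumption, any day whose first run falls outside 01:00-04:00 never hits the window again (the +6 h, +12 h, +18 h runs step over it), which matches the original subject about the opening time being shorter than the period.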
Updated by P C about 4 years ago
- Subject changed from Missing report with directive Scheduled Job when Opening time shorter then period to Missing report with directive Scheduled Job
- Severity changed from Minor - inconvenience | misleading | easy workaround to Major - prevents use of part of Rudder | no simple workaround
I have a general setup with an agent run period of 2 hours and a 1-hour max delay, and 2 Scheduled Job directives configured like this (for all my 20 nodes):
Job1
Lowest time the command should be run at: 1
Highest time the command should be run at: 4
Consider the job failed after (minutes): 240
Return codes considered as a success: 0
Return codes considered as a repair: 1
Return codes considered as an error: 2
Job2
Lowest time the command should be run at: 0
Highest time the command should be run at: 23
Consider the job failed after (minutes): 240
Return codes considered as a success: 0
Return codes considered as a repair: 1
Return codes considered as an error: 2
Regularly, some nodes (not the same ones; most of the time only one, but it can be more) have the compliance message 'missing report', sometimes for one job, sometimes for both.
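Under the same simplified assumption as the sketch above, this second setup should never miss the windows: with a 2-hour period, some agent run always lands inside a 3-hour (Job1) or 23-hour (Job2) window, whatever the splay. A quick check of that, using a hypothetical helper name and the same simplification:

def window_always_covered(period_h, splay_h, low_h, high_h, step_h=0.01):
    """Check that, for every first-run offset across the splay, some run hits the window."""
    offset = 0.0
    while offset <= splay_h:
        t = offset
        hit = False
        while t < 24:
            if low_h <= (t % 24) <= high_h:
                hit = True
                break
            t += period_h
        if not hit:
            return False
        offset += step_h
    return True

if __name__ == "__main__":
    # second setup: 2 h agent period, 1 h max delay
    print(window_always_covered(2, 1, 1, 4))   # Job1 window 01:00-04:00 -> True
    print(window_always_covered(2, 1, 0, 23))  # Job2 window 00:00-23:00 -> True

So if reports are still missing with this setup, a window/period mismatch alone cannot be the whole story.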
Updated by Nicolas CHARLES about 4 years ago
Some more details:
On a node with an agent schedule of every 2 hours, the command was executed on:
2020-09-30T00:50:10+00:00
2020-10-01T00:50:33+00:00
2020-10-02T00:50:03+00:00
2020-10-03T00:51:06+00:00
2020-10-05T00:50:29+00:00
So there was no run on the 4th and 6th (validated by a file touched at the beginning of the action).
When a job scheduler run was expected, the log only says:
2020-10-05T00:07:54+00:00 R: @@jobScheduler@@log_info@@xxxxxxxxx-xxxxxxxx@@yyyy-yyyyy@@0@@None@@job_to_run_zzzz_zzzzz@@2020-10-05 00:07:46+00:00##nodeId@#Scheduling job1_to_run_zzzz_zzzzz was correct
2020-10-05T00:07:54+00:00 R: @@jobScheduler@@log_info@@xxxxxxxxx-xxxxxxxx@@yyyy-yyyyy@@0@@None@@job_to_run_zzzz_zzzzz@@2020-10-05 00:07:46+00:00##nodeId@#Scheduling Scheduling job2_to_run_zzzz_zzzzz was correct
2020-10-05T00:07:54+00:00 R: @@jobScheduler@@log_info@@xxxxxxxxx-xxxxxxxx@@yyyy-yyyyy@@0@@Job@@command1@@2020-10-05 00:07:46+00:00##nodeId@#The command will be run at a random time after 00:00 on this node
2020-10-05T00:07:54+00:00 R: @@jobScheduler@@log_info@@xxxxxxxxx-xxxxxxxx@@yyyy-yyyyy@@0@@Job@@command2@@2020-10-05 00:07:46+00:00##nodeId@#The command will be run at a random time after 00:00 on this node
There are 2 jobs here; that might be related.
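For anyone grepping these logs, the report lines are '@@'-delimited; here is a minimal parsing sketch. The field names below are inferred from the sample lines above (the exact meaning and order of the identifier fields is an assumption, not taken from Rudder documentation):

def parse_report_line(line):
    """Split one 'R: @@...' report line into labelled fields (names are guesses)."""
    agent_timestamp, _, payload = line.partition(" R: ")
    fields = payload.split("@@")
    # fields[0] is empty because the payload starts with '@@'
    technique, report_type, id_field_1, id_field_2, serial, component, key_value, tail = fields[1:9]
    ids_and_node, _, message = tail.partition("@#")
    execution_timestamp, _, node_id = ids_and_node.partition("##")
    return {
        "agent_timestamp": agent_timestamp,
        "technique": technique,               # e.g. jobScheduler
        "report_type": report_type,           # e.g. log_info
        "id_field_1": id_field_1,             # directive/rule identifiers (order not verified)
        "id_field_2": id_field_2,
        "serial": serial,
        "component": component,               # e.g. Job
        "key_value": key_value,               # e.g. command1
        "execution_timestamp": execution_timestamp,
        "node_id": node_id,
        "message": message,
    }

if __name__ == "__main__":
    sample = ("2020-10-05T00:07:54+00:00 R: @@jobScheduler@@log_info@@xxxxxxxxx-xxxxxxxx"
              "@@yyyy-yyyyy@@0@@Job@@command1@@2020-10-05 00:07:46+00:00##nodeId"
              "@#The command will be run at a random time after 00:00 on this node")
    print(parse_report_line(sample))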
Updated by Nicolas CHARLES about 4 years ago
- Target version set to 6.1.7
Nothing looks weird on the code side, but it could be related to https://github.com/cfengine/core/pull/4257
Updated by Nicolas CHARLES about 4 years ago
- Related to Bug #18732: backport fix on background command execution on agent added
Updated by Vincent MEMBRÉ about 4 years ago
- Target version changed from 6.1.7 to 6.1.8
Updated by Vincent MEMBRÉ almost 4 years ago
- Target version changed from 6.1.8 to 6.1.9
Updated by Vincent MEMBRÉ almost 4 years ago
- Target version changed from 6.1.9 to 6.1.10
Updated by Nicolas CHARLES almost 4 years ago
The lock condition is invalid; it should be !job_scheduler_lock_${iterator}_&RudderUniqueID&
I don't think it would cause the problem here, but it is wrong.
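To illustrate why the negation matters, here is a plain-Python sketch of what such a lock class is presumably for (illustrative only, not the technique's CFEngine code): the guard should let the command run only while the per-iteration lock class is not yet defined, and define it afterwards so later passes skip the job.

# Illustrative only: the class name format mirrors the one quoted above,
# but the run-once semantics here are an assumption, not the technique's code.
defined_classes = set()

def maybe_run_job(iterator, unique_id, run_command):
    lock_class = f"job_scheduler_lock_{iterator}_{unique_id}"
    # equivalent of guarding the promise with !job_scheduler_lock_<iterator>_<uniqueId>
    if lock_class not in defined_classes:
        run_command()
        defined_classes.add(lock_class)  # set the lock so a later pass skips the job

if __name__ == "__main__":
    maybe_run_job("1", "EXAMPLE_ID", lambda: print("job executed"))
    maybe_run_job("1", "EXAMPLE_ID", lambda: print("job executed"))  # second call is skipped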
Updated by Vincent MEMBRÉ almost 4 years ago
- Target version changed from 6.1.10 to 6.1.11
Updated by Vincent MEMBRÉ almost 4 years ago
- Target version changed from 6.1.11 to 6.1.12
Updated by Vincent MEMBRÉ over 3 years ago
- Target version changed from 6.1.12 to 6.1.13
Updated by Vincent MEMBRÉ over 3 years ago
- Target version changed from 6.1.13 to 6.1.14
Updated by Vincent MEMBRÉ over 3 years ago
- Target version changed from 6.1.14 to 6.1.15
Updated by Vincent MEMBRÉ over 3 years ago
- Target version changed from 6.1.15 to 6.1.16
Updated by Vincent MEMBRÉ over 3 years ago
- Target version changed from 6.1.16 to 6.1.17
Updated by Vincent MEMBRÉ about 3 years ago
- Target version changed from 6.1.17 to 6.1.18
Updated by Vincent MEMBRÉ about 3 years ago
- Target version changed from 6.1.18 to 6.1.19
Updated by Alexis Mousset almost 3 years ago
- Severity changed from Major - prevents use of part of Rudder | no simple workaround to Minor - inconvenience | misleading | easy workaround
- User visibility set to Operational - other Techniques | Rudder settings | Plugins
- Priority changed from 0 to 27
Updated by Vincent MEMBRÉ over 2 years ago
- Target version changed from 6.1.19 to 6.1.20
Updated by Vincent MEMBRÉ over 2 years ago
- Target version changed from 6.1.20 to 6.1.21
Updated by Vincent MEMBRÉ over 2 years ago
- Target version changed from 6.1.21 to old 6.1 issues to relocate
Updated by Alexis Mousset almost 2 years ago
- Category changed from Agent to Techniques
- Priority changed from 27 to 0
Updated by Alexis Mousset 8 months ago
- Target version changed from old 6.1 issues to relocate to 7.3.15
Updated by Vincent MEMBRÉ 7 months ago
- Target version changed from 7.3.15 to 7.3.16
Updated by Vincent MEMBRÉ 6 months ago
- Target version changed from 7.3.16 to 7.3.17