Spring Boot Batch 6.x: Job Parameters are null in Reader despite using @StepScope

Summary

The core issue is a mismatch between the bean definition and Spring Batch’s scoped proxy requirements. The job parameter injection fails in the Wso2InstanceItemReader because the JobParams bean is not correctly configured as a step-scoped bean that receives the JobParameters context.

While the getInstanceId method in the configuration is annotated with @StepScope, it does not define a proper Spring Batch ItemReader or ItemProcessor. Consequently, Spring does not recognize it as a component that requires step-level lifecycle management or proxying to inject runtime JobParameters. The reader then injects this JobParams bean, which is either not proxied or not correctly in scope, so the values it exposes are null.

Root Cause

The root cause is the incorrect definition of a step-scoped bean intended to hold job parameters.

  1. Invalid Bean Type: The configuration defines a bean getInstanceId returning a JobParams POJO. Spring Batch’s @StepScope is designed primarily for ItemReader, ItemProcessor, ItemWriter, and Tasklet beans. Defining a simple POJO holder this way prevents the necessary proxying mechanism that allows SpEL expressions like #{jobParameters[...]} to resolve dynamically.
  2. Lack of Proxying: Because JobParams is not a batch component, Spring does not automatically create a CGLIB proxy around it. Without this proxy, the #{jobParameters['instanceId']} SpEL expression is evaluated once at application startup (where no job parameters exist), rather than at step execution time.
  3. Scoping Failure: The reader injects the JobParams bean. Since the bean wasn’t proxied correctly, it retains the default value (null) assigned at startup.
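Based on the description above, the failing configuration likely looked roughly like this. This is a reconstruction, not the original code: the JobParams constructor and the exact bean wiring are assumed; only the names JobParams and getInstanceId come from the report.

```java
// Reconstruction of the failing "holder bean" pattern (details assumed).
// Imports omitted for brevity, matching the surrounding examples.
@Configuration
public class BatchConfig {

    // A @StepScope "holder" bean for job parameters. Because JobParams is a
    // plain POJO rather than a batch component, the step-scope proxy chain
    // breaks and the consuming reader sees the startup-time (null) value.
    @Bean
    @StepScope
    public JobParams getInstanceId(
            @Value("#{jobParameters['instanceId']}") String instanceId) {
        return new JobParams(instanceId); // hypothetical constructor
    }
}
```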

Why This Happens in Real Systems

In real-world systems, this pattern of “parameter adaptation” is common but fragile. Developers often try to create a Parameter Object (like JobParams) to decouple the reader from the raw JobParameters map.

However, this abstraction breaks the step-level lifecycle. Spring Batch resolves step-scoped beans through a dedicated StepScope bound to the current StepExecution; beans that depend on runtime JobParameters must be resolved lazily within that scope, not eagerly at application startup.

  • If you define a standard singleton bean, it has no access to the transient JobParameters.
  • If you define a @StepScope bean that isn’t a batch component, the framework may not register it correctly in the scoped context, or the JobParameters resolution never triggers because the bean isn’t part of the Step's ExecutionContext lifecycle.
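By contrast, a step-scoped bean that is a genuine batch component binds late and resolves correctly. A minimal sketch, assuming the Wso2InstanceDao and Wso2Instance types from this article; ListItemReader is a stock Spring Batch reader over an in-memory list:

```java
// Sketch of a correctly defined step-scoped batch component.
// Imports omitted for brevity, matching the surrounding examples.
@Configuration
public class ScopedReaderConfig {

    // Valid: the bean type is ItemReader, so the step-scope proxy defers
    // creation until a StepExecution exists and the SpEL resolves at runtime.
    @Bean
    @StepScope
    public ItemReader<Wso2Instance> wso2InstanceReader(
            Wso2InstanceDao dao,
            @Value("#{jobParameters['instanceId']}") String instanceId) {
        return new ListItemReader<>(dao.getWso2Instances(instanceId));
    }
}
```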

Real-World Impact

  • Null Pointer Exceptions (NPEs): The most immediate impact. The reader likely constructs SQL queries or file paths based on jobParams.getInstanceId(), resulting in invalid queries or logic errors.
  • Job Restart Failure: If the job fails and attempts to restart, the JobParameters are essential for identifying the unique execution. If they are null, the job may try to reprocess data it has already completed or fail to locate the last safe commit point.
  • Hard-Coded Logic: Without dynamic parameters, the batch job loses its ability to be parameterized (e.g., processing a specific date range or entity ID), forcing developers to hardcode values or restart the entire application context.
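For context on the restart point: the parameters passed at launch are what identify the JobInstance, and what restart logic uses to find the last committed state. An illustrative launch fragment (job and parameter values are assumed):

```java
// Illustrative launch code (names and values assumed). The JobParameters
// built here define the JobInstance identity used on restart.
JobParameters params = new JobParametersBuilder()
        .addString("instanceId", "wso2-prod-01")            // business parameter
        .addLong("launch.time", System.currentTimeMillis()) // forces a new instance
        .toJobParameters();
jobLauncher.run(wso2InstanceJob, params);
```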

Example or Code

To fix this, we must move the parameter injection directly into the reader or create a valid step-scoped processor/reader to hold the data.

Here is the corrected Wso2InstanceItemReader implementation. We inject the instanceId directly via SpEL into the reader’s field.

@Component
@RequiredArgsConstructor
@StepScope
@Slf4j
public class Wso2InstanceItemReader implements ItemReader<Wso2Instance> {

    private final Wso2InstanceDao wso2InstanceDao;
    private Iterator<Wso2Instance> iterator;

    // Injection moved directly into the Reader
    @Value("#{jobParameters['instanceId']}")
    private String instanceId;

    @Override
    public @Nullable Wso2Instance read() {
        if (iterator == null) {
            // Now using the injected instanceId field
            log.info("Processing for Instance ID: {}", instanceId);
            List<Wso2Instance> data = wso2InstanceDao.getWso2Instances(instanceId);
            iterator = data.iterator();
        }
        if (iterator.hasNext()) {
            return iterator.next();
        }
        return null;
    }
}

Additionally, remove the getInstanceId bean definition from BatchConfig, as it is unnecessary and invalid.

How Senior Engineers Fix It

Senior engineers avoid creating “holder” beans for job parameters. They follow these principles:

  1. Inject Directly: Inject JobParameters values directly into the component that needs them (Reader, Processor, Writer) using @Value("#{jobParameters['key']}").
  2. Interface Injection: If multiple components need the parameters, inject the whole JobParameters object:
    @Value("#{jobParameters}")
    private JobParameters jobParameters;
  3. Lazy Evaluation: Ensure that any logic relying on these parameters is executed inside the read() or write() methods, not in the constructor or @PostConstruct, to ensure the proxy has the correct execution context.
  4. Clean Configuration: Keep the BatchConfig strictly for flow definition (Steps and Jobs). Put component logic in separate, annotated classes.
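Principle 4 might look like this in practice. A sketch with assumed step, job, and writer names, using the JobRepository-based builders from Spring Batch 5+; the reader is the step-scoped component shown earlier:

```java
// Flow-only configuration sketch (names assumed).
// Imports omitted for brevity, matching the surrounding examples.
@Configuration
public class BatchConfig {

    // Flow definition only: the step-scoped reader is injected as a proxy;
    // all parameter logic lives inside the component classes.
    @Bean
    public Step wso2InstanceStep(JobRepository jobRepository,
                                 PlatformTransactionManager txManager,
                                 Wso2InstanceItemReader reader,
                                 ItemWriter<Wso2Instance> writer) {
        return new StepBuilder("wso2InstanceStep", jobRepository)
                .<Wso2Instance, Wso2Instance>chunk(100, txManager)
                .reader(reader)
                .writer(writer)
                .build();
    }

    @Bean
    public Job wso2InstanceJob(JobRepository jobRepository, Step wso2InstanceStep) {
        return new JobBuilder("wso2InstanceJob", jobRepository)
                .start(wso2InstanceStep)
                .build();
    }
}
```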

Why Juniors Miss It

Juniors often miss this because of the “Magic of Spring” misconception. They see @StepScope and SpEL used together and assume it works on any bean.

  • Abstraction Overkill: They try to be “clean” by creating a JobParams class to avoid passing strings around, not realizing Spring Batch requires specific interfaces to hook into the scope lifecycle.
  • Misunderstanding Proxies: They don’t grasp that @StepScope creates a proxy that intercepts calls. If the bean isn’t a standard Spring Batch component, the proxy might not be created, or the context lookup fails.
  • Copy-Paste Errors: They copy code from examples that show @StepScope on a Tasklet (which is valid) and try to apply it to a generic @Component or POJO (which is invalid).