Return a single item from an ItemReader in Spring Batch

I didn't find an answer to my problem, so I'll try to explain it with Spring Batch. Imagine we have a database table with 4 records to retrieve. I have a Job with one Step declared as <ObjectA, ObjectA>chunk(10), i.e. a chunk size of 10.

I have this code in my reader:

@Component
public class InterReader implements ItemReader<ObjectA> {

    public static final Log logger = LogFactory.getLog(InterReader.class);

    @Autowired
    Service service;

    private List<ObjectA> list;

    @Override
    public ObjectA read() throws Exception {
        try {
            if (list != null && !list.isEmpty()) {
                return list.remove(0);
            } else {
                list = service.getResults();
                if (list != null && !list.isEmpty()) {
                    return list.remove(0);
                } else {
                    return null;
                }
            }
        } catch (ServiceException serviceException) {
            logger.error("Error", serviceException);
        }
        return null;
    }
}
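As I understand it, the problem is that my reader queries the service again every time the list runs out. For illustration, here is a minimal sketch (outside Spring, so it compiles on its own) of a reader that queries only once and then signals end of input with null. The names OneShotReader and fetch are mine; fetch stands in for service.getResults() above:

```java
import java.util.ArrayDeque;
import java.util.Deque;
import java.util.List;
import java.util.function.Supplier;

// Sketch of a reader that calls the service exactly once per step execution.
// After the fetched items are handed out one by one, read() keeps returning
// null instead of querying again, so no item is ever duplicated.
class OneShotReader<T> {

    private final Supplier<List<T>> fetch; // stand-in for service.getResults()
    private Deque<T> buffer;               // null until the first read() call
    private boolean exhausted = false;

    OneShotReader(Supplier<List<T>> fetch) {
        this.fetch = fetch;
    }

    // Mirrors the ItemReader.read() contract: one item per call,
    // null means "no more input".
    public T read() {
        if (exhausted) {
            return null;
        }
        if (buffer == null) {              // first call: query exactly once
            List<T> results = fetch.get();
            buffer = new ArrayDeque<>();
            if (results != null) {
                buffer.addAll(results);
            }
        }
        if (buffer.isEmpty()) {
            exhausted = true;              // do NOT query the service again
            return null;
        }
        return buffer.poll();
    }
}
```

With 4 records available, this returns the 4 items and then null on every further call.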

The service retrieves up to 10 results; in our case only 4 are left, so it returns 4. They are passed from the reader one by one, and when the list becomes empty, because the chunk of 10 is not yet complete, the query is executed again and returns the same 4 results, and so on until the chunk is filled. So the step ends up processing duplicated items (in our case, the same 4 results repeated).
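To make the behaviour I want clearer: as far as I know, chunk processing keeps calling read() until the chunk size is reached or read() returns null, and the writer then receives whatever partial chunk was collected. A rough sketch of that loop (my simplification, not Spring Batch's actual code):

```java
import java.util.ArrayList;
import java.util.List;
import java.util.function.Supplier;

// Rough sketch of the chunk-oriented loop: read up to chunkSize items,
// stop as soon as the reader returns null, and hand each (possibly
// partial) chunk to the writer. Not Spring Batch's real implementation.
class ChunkLoopSketch {

    // Returns the list of chunks handed to the "writer"; the last chunk
    // may be smaller than chunkSize when the reader runs out of items.
    static <T> List<List<T>> run(Supplier<T> reader, int chunkSize) {
        List<List<T>> chunks = new ArrayList<>();
        List<T> chunk = new ArrayList<>();
        T item;
        while ((item = reader.get()) != null) {
            chunk.add(item);              // the processor would run here
            if (chunk.size() == chunkSize) {
                chunks.add(chunk);        // writer gets a full chunk
                chunk = new ArrayList<>();
            }
        }
        if (!chunk.isEmpty()) {
            chunks.add(chunk);            // writer gets the final partial chunk
        }
        return chunks;
    }
}
```

So with 4 records and a chunk size of 10, a reader that returns null after the 4th item should produce a single chunk of 4 for the writer, which is the behaviour I am after.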

Can someone give me a good example of how I could/should load a list in a reader but pass its items one by one to the processor, so that the writer receives a single chunk-limited list (in our case, at most 10 elements)?

These are the solutions I have tried, though I hope there is a better approach:

  1. In the query that retrieves my "block" of results, mark the selected rows with an update, so that when the service is executed again it finds no results and the processor runs only with the records already retrieved (4 in our case).
  2. Implement an ItemReader<List<ObjectA>> that returns the whole list, and then loop over it in the processor and the writer, but I think there must be a better way.
  3. Keep an auxiliary list in the reader with all items retrieved so far, and check whether a newly retrieved list contains the same records: if not, they are processed normally; if it does contain duplicates, the reader returns null, so reading finishes and the step moves on to the processor and writer.
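For option 3, what I have in mind is roughly the following sketch (outside Spring; DedupReader and fetch are my placeholder names, and records are represented by a plain id here):

```java
import java.util.ArrayDeque;
import java.util.Deque;
import java.util.HashSet;
import java.util.List;
import java.util.Set;
import java.util.function.Supplier;

// Sketch of option 3: remember the ids already handed out; when a re-query
// returns only records we have already seen, read() returns null so the
// step finishes instead of emitting duplicates.
class DedupReader {

    private final Supplier<List<Long>> fetch;   // stand-in for service.getResults()
    private final Set<Long> seen = new HashSet<>();
    private final Deque<Long> buffer = new ArrayDeque<>();

    DedupReader(Supplier<List<Long>> fetch) {
        this.fetch = fetch;
    }

    public Long read() {
        if (buffer.isEmpty()) {
            List<Long> results = fetch.get();
            if (results != null) {
                for (Long id : results) {
                    if (seen.add(id)) {         // keep only unseen records
                        buffer.add(id);
                    }
                }
            }
        }
        return buffer.poll();                   // null when nothing new remains
    }
}
```

It works, but it re-executes the query one extra time just to discover there is nothing new, which is why I suspect there is a cleaner way.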

Sorry for my bad English, and thank you.