Watch the video below, or alternatively read the transcript, then move on to Consolidation Exercise 1.
We know from previous lessons that behaviours that are reinforced increase in frequency in the future.
However, behaviours are reinforced at different rates. Some behaviours are reinforced every time they occur, while others are only reinforced some of the time.
Take the example of a person who buys a lottery ticket. Their behaviour may be reinforced only occasionally, and yet the behaviour persists.
In this lesson, we'll look at how the rate at which reinforcement occurs affects the occurrence of behaviour.
Schedules of Reinforcement
Continuous Versus Intermittent
When a behaviour is reinforced every time it occurs, we say that it is on a continuous schedule of reinforcement. This is contrasted with an intermittent schedule of reinforcement, where not every occurrence of the behaviour is reinforced; reinforcement is only provided some of the time.
A good example of a behaviour that is on a continuous schedule of reinforcement is using a vending machine to buy chocolate. Every time you put money in the machine, you get a reinforcer. By contrast, when using a slot machine, you only receive reinforcement for putting money in the machine on an occasional basis.
However, these categories of schedule are quite broad, and within ABA we generally describe schedules in more detail.
Fixed Ratio Versus Variable Ratio
In a fixed ratio (FR) schedule, a specific or fixed number of responses must occur before the reinforcer is delivered. A behaviour that is continuously reinforced is on an FR1 schedule, because every occurrence is reinforced. A behaviour reinforced after every three responses would be described as being on an FR3 schedule, while a behaviour reinforced after every 10 responses would be on an FR10 schedule.
Good examples of these schedules can be seen in industrial and academic settings. For example, a child may receive a token at school for correctly answering 10 questions. Similarly, a worker in a factory may be on a piece rate and receive a fixed payment for every 20 items packaged.
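For readers who like to see the rule stated precisely, here is a small illustrative sketch in Python (the class and its names are our own, not part of the lesson): an FRn schedule simply counts responses and delivers a reinforcer on every n-th one.

```python
class FixedRatio:
    """Delivers a reinforcer after every `n` responses (an FRn schedule)."""

    def __init__(self, n):
        self.n = n
        self.count = 0  # responses since the last reinforcer

    def respond(self):
        """Record one response; return True if it earns a reinforcer."""
        self.count += 1
        if self.count == self.n:
            self.count = 0
            return True
        return False


# An FR1 schedule reinforces every response (continuous reinforcement);
# an FR3 schedule reinforces every third response:
fr3 = FixedRatio(3)
print([fr3.respond() for _ in range(6)])
# [False, False, True, False, False, True]
```

In the factory example above, the worker's packaging behaviour would correspond to `FixedRatio(20)`: every 20th item packaged produces the payment.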
Fixed ratio schedules are contrasted with variable ratio schedules.
When a behaviour is on a variable ratio (VR) schedule, the number of responses required varies around an average; that is, a reinforcer is delivered after an average number of responses.
For example, a student might be on a VR8 schedule for completing Maths problems. That means they would receive a reinforcer after an average of 8 correct responses. The first reinforcer might come after completing the 7th problem, the second after another 9 problems, and the third after another 8.
Similarly, a call centre worker who receives payment for sales of insurance products could be described as being on a VR8 schedule of reinforcement if, on average, they made one sale for every 8 calls. Some days they might make two sales in a row; on other days they might make a sale only after 20 calls. But if the average number of calls taken to make a sale was 8, we describe the behaviour as being on a VR8 schedule.
Fixed Interval Versus Variable Interval
With an interval schedule, a response is reinforced only after a period of time has elapsed. That is, if a behaviour is on a fixed interval 1-minute schedule (FI1), then the first instance of the behaviour after 1 minute has passed will be reinforced.
Imagine a teacher who has assigned a student the task of completing Maths worksheets. Every 5 minutes, she gets up and goes over to the student to check their work. If the student completes their current Maths problem correctly, she praises them. In this scenario, it does not matter how many of the problems completed earlier were correct or incorrect. Reinforcement is available on a fixed interval of 5 minutes (FI5).
Similarly, if a behaviour is on a variable interval (VI) schedule, the time interval varies around an average. For example, a kitchen worker might prepare meals under the supervision of a chef. The chef praises his workers for staying on-task whenever he observes them. We would say that the kitchen worker is on a variable interval 20-minute schedule (VI20) if, on average, the chef observed and praised the worker for being on-task every 20 minutes. Sometimes there might be a 5-minute interval between opportunities to receive praise; at other times, the interval might be 60 minutes.
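The key difference from ratio schedules is that an interval schedule makes reinforcement available based on the clock, and only the first response after the interval elapses is reinforced. A minimal sketch (illustrative names and time units of our own choosing, with time in simulated minutes):

```python
class FixedInterval:
    """FI schedule: the first response after each `interval` minutes is reinforced."""

    def __init__(self, interval):
        self.interval = interval
        self.available_at = interval  # time at which the next reinforcer becomes available

    def respond(self, t):
        """A response occurring at time `t`; return True if it is reinforced."""
        if t >= self.available_at:
            # Reinforce this response and restart the interval clock.
            self.available_at = t + self.interval
            return True
        return False


# FI5, as in the teacher example: responses before minute 5 earn nothing,
# the first response at or after minute 5 is reinforced, then the clock restarts.
fi5 = FixedInterval(5)
print(fi5.respond(3))  # False - too early
print(fi5.respond(6))  # True - first response after the 5-minute interval
print(fi5.respond(8))  # False - next reinforcer not available until minute 11
```

A VI schedule would look the same except that, after each reinforcer, the next interval is drawn to vary around an average (e.g. around 20 minutes for VI20), just as the VR requirement varied around an average number of responses.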
Behaviour Change: Extinction and Maintenance
In our earlier lessons, we discussed extinction. Extinction is also a type of reinforcement schedule: one in which a previously reinforced behaviour is no longer reinforced when it occurs.
A behaviour that has been reinforced on an intermittent basis is more resistant to extinction than one that has been continuously reinforced. That is, when placed on extinction, a behaviour that has been intermittently reinforced will take longer to reduce to zero levels.
When a behaviour is reinforced frequently, we say that the behaviour is on a "thick" schedule of reinforcement. When a behaviour is reinforced infrequently, we say that it is on a "thin" schedule of reinforcement. The thicker the schedule of reinforcement a behaviour was on, the quicker it will extinguish.
This knowledge is important in clinical practice. If we are considering using an extinction intervention to help a client decrease a behaviour, we know that the procedure will take longer to work if the behaviour is currently maintained on a thin, intermittent schedule of reinforcement.
If, on the other hand, we are trying to help increase a behaviour, we know that a behaviour currently being reinforced on a thick or continuous schedule will likely not maintain over time once the intervention ends.
This is why behaviour analysts frequently use thick or continuous schedules of reinforcement to help establish a behaviour and then "thin" those schedules over time, moving from high levels of reinforcement towards the levels that are more common outside of the instructional environment.
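The thinning process described above can be sketched as a simple progression of ratio requirements. The specific steps and pacing below are assumptions for illustration; in practice the steps and the criterion for advancing are clinical decisions, not fixed values.

```python
def thinning_steps(schedule_values, reinforcers_per_step):
    """Yield the current FR requirement for each reinforcer delivered,
    stepping to the next (thinner) schedule after a fixed number of
    reinforcers at each step. Both arguments are illustrative choices."""
    for n in schedule_values:
        for _ in range(reinforcers_per_step):
            yield n


# Start on continuous reinforcement (FR1) and thin towards FR10,
# advancing after every 3 reinforcers at each step:
plan = list(thinning_steps([1, 2, 5, 10], reinforcers_per_step=3))
print(plan)  # [1, 1, 1, 2, 2, 2, 5, 5, 5, 10, 10, 10]
```

The design idea is simply that the thick schedule at the start makes the behaviour easy to establish, while the final thin schedule resembles the intermittent reinforcement the behaviour will actually meet in the natural environment, which, as discussed above, also makes it more resistant to extinction.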