Study 7: Schedules of Reinforcement
8 March 1999
Abstract
Rats that have acquired the operant response of food-reinforced
lever pressing will be placed on variable interval schedules of
reinforcement.
To shape variable interval responding, rats will first receive
reinforcement according to a continuous schedule of reinforcement.
The variable interval requirement will then be introduced and subsequently
increased in a systematic manner.
Introduction
As Skinner noted in his survey of his work (Skinner 1959),
the study of reinforcement often involves different
contingencies, or schedules of reinforcement.
Indeed, Skinner once published a book consisting
entirely of the analysis of cumulative records from different
schedules (Ferster and Skinner 1957).
Our project for this week is less grand.
For those of you who successfully shaped your rat in
Study 6 (and I think that includes everyone),
this project will involve the measurement of
performance on a variable interval (VI) schedule.
VI schedules have been studied so often that it is
difficult to find a new point of view.
Nevertheless, we will be collecting data that are
relevant to a still unresolved issue regarding the processes
that control VI responding.
Specifically, we will be looking at interresponse times
(IRTs) during VI responding.
Platt (1979) has proposed that VI reinforcement
schedules serve to strengthen particular IRT distributions
and has suggested a model of behavior based on this idea.
Indeed, Staddon and Ettinger (1989) have even
suggested that IRT analysis might be relevant to applying
the new mathematical techniques of chaos theory to operant
behavior.
Methods
Subjects:
Our Sprague-Dawley rats will serve as subjects.
Apparatus:
We will be using the six custom-constructed
chambers to test our animals.
Each of these chambers will be fitted with a response
lever that the animal can press.
In addition, we will use an electric clock to
measure time intervals during training.
Reinforcers will consist of chocolate sprinkles
delivered to the food cup in each box after a tap on the
chamber wall.
Procedure:
Our lab will again be 90
minutes long.
During that time we will reinforce the rats according
to a VI schedule.
We will gradually work up to a VI 100 sec schedule.
During the VI 100 sec phase, the time it takes the rat to
perform a series of 5 responses will be measured.
Start by reinforcing your rat according to a CRF
(continuous reinforcement) schedule.
Deliver 5 reinforcers on this schedule.
Then deliver 5 reinforcers according to a VI 10 sec
schedule.
Begin with the first time interval in Table 1 (these tables will
be distributed in class).
Wait until this interval has elapsed, and then reinforce
the next response that occurs.
Deliver 5 reinforcers on this schedule.
Then deliver 10 reinforcers according to a VI 20 sec
schedule; intervals for this schedule are given in
Table 2.
Finally, switch to the VI 100 sec schedule.
These intervals are given in Table 3.
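To make the contingency explicit, here is a minimal sketch (in Python, purely for illustration; it is not part of the lab procedure) of how a VI schedule decides which responses get reinforced. The interval values and response times below are made-up placeholders, not the values in Tables 1-3.

# Sketch of the VI contingency: an interval is drawn from a list, the clock
# runs until that interval has elapsed since the last reinforcer, and the
# first response made after that point is reinforced.
import random

def run_vi_schedule(intervals, response_times):
    """Return the times of reinforced responses under a VI schedule.

    intervals      -- list of interval lengths in seconds (placeholder values)
    response_times -- sorted list of times (sec) at which responses occurred
    """
    reinforced = []
    last_reinforcer = 0.0
    interval_iter = iter(intervals)
    armed_at = last_reinforcer + next(interval_iter)
    for t in response_times:
        # A response is reinforced only if the current interval has elapsed.
        if t >= armed_at:
            reinforced.append(t)
            last_reinforcer = t
            try:
                armed_at = last_reinforcer + next(interval_iter)
            except StopIteration:
                break
    return reinforced

# Example: made-up VI 10 sec intervals and some hypothetical response times.
vi10_intervals = [7, 12, 9, 14, 8]                       # mean roughly 10 sec
responses = sorted(random.uniform(0, 120) for _ in range(60))
print(run_vi_schedule(vi10_intervals, responses))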
During the VI 100 sec schedule, record the time it
takes the rat to perform five responses.
That is, note down the time at which every fifth response
is made and subtract the previous such time to get the "waiting
time" to the fifth response.
Note which of these waiting times contain the delivery of a
reinforcer.
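The bookkeeping is just successive differences. A small sketch with hypothetical numbers (the clock readings and reinforcer times below are invented for illustration):

# Sketch of the waiting-time bookkeeping, using hypothetical numbers.
# fifth_response_times: clock readings (sec) at every 5th response.
# reinforcer_times:     clock readings (sec) at which reinforcers were delivered.
fifth_response_times = [0, 42, 95, 140, 210, 250]
reinforcer_times = [90, 205]

waiting_times = []
for prev, curr in zip(fifth_response_times, fifth_response_times[1:]):
    contains_reinforcer = any(prev < r <= curr for r in reinforcer_times)
    waiting_times.append((curr - prev, contains_reinforcer))

# e.g. [(42, False), (53, True), (45, False), (70, True), (40, False)]
print(waiting_times)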
After the session, weigh and feed your rat.
Results
We will try two different kinds of analysis of these
data.
The traditional means of analyzing operant behavior
is the "cumulative recorder".
We don't have enough cumulative recorders for all
the chambers in our class, but we can construct a
cumulative record from the waiting times.
Each waiting time marks an additional 5-response
increment in the cumulative number of responses.
If the cumulative total of responses is plotted against
the cumulative time, we have a rough version of the
cumulative record.
Try plotting out your rat's behavior in this way.
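If you would rather let a computer do the plotting, the sketch below (Python with matplotlib; the waiting-time values are the same hypothetical numbers as above) shows one way to build the rough cumulative record.

# Sketch of building a rough cumulative record from the waiting times.
# waiting_times is assumed to hold (seconds, contains_reinforcer) pairs.
import matplotlib.pyplot as plt

waiting_times = [(42, False), (53, True), (45, False), (70, True), (40, False)]  # hypothetical

cumulative_time = [0]
cumulative_responses = [0]
for seconds, _ in waiting_times:
    cumulative_time.append(cumulative_time[-1] + seconds)
    cumulative_responses.append(cumulative_responses[-1] + 5)  # 5 responses per waiting time

plt.step(cumulative_time, cumulative_responses, where="post")
plt.xlabel("Cumulative time (sec)")
plt.ylabel("Cumulative responses")
plt.title("Approximate cumulative record (5-response resolution)")
plt.show()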
What kinds of biases are induced in constructing a
cumulative record in this manner?
The second type of analysis is an IRT distribution
analysis.
Strictly speaking, this analysis requires the waiting
times to single responses.
For reasons of practicality, we are using waiting times
to the fifth response, but the same considerations apply.
Begin by constructing a frequency histogram of all
the waiting times, excluding those that contain a reinforcer
delivery (the time spent eating a sprinkle would complicate the
analysis).
Plot the histogram for the entire session.
What shape does the histogram seem to show? Now,
take the waiting times in the first third of the VI 100 sec
phase (e.g., if you have a total of 60 waiting times, take the
first 20 of these) and construct a histogram.
Compare this to the histogram of the last third of the
phase.
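The same comparison can be sketched in Python (again with the made-up waiting times from above; the variable names are only illustrative):

# Sketch of the waiting-time histograms: whole session, first third, last third.
# waiting_times is again a list of (seconds, contains_reinforcer) pairs.
import matplotlib.pyplot as plt

waiting_times = [(42, False), (53, True), (45, False), (70, True), (40, False)]  # hypothetical

# Drop waiting times that contain a reinforcer delivery.
clean = [seconds for seconds, reinforced in waiting_times if not reinforced]

third = len(clean) // 3
fig, axes = plt.subplots(1, 3, sharex=True, sharey=True)
panels = [("Whole session", clean),
          ("First third", clean[:third]),
          ("Last third", clean[-third:])]
for ax, (label, data) in zip(axes, panels):
    ax.hist(data, bins=10)
    ax.set_title(label)
    ax.set_xlabel("Waiting time (sec)")
axes[0].set_ylabel("Frequency")
plt.show()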
Do you think the VI 100 sec training has affected the
waiting-time distribution?
Skinner (1959) has argued for the importance of
cumulative records in behavior analysis.
Basically, he would argue that cumulative records
provide better information than something like our
IRT analysis does.
Do you think the IRT analysis provides any useful
additional information?
References
Required
Platt, J.R. (1979) Interresponse-time shaping by variable-interval-like interresponse-time reinforcement contingencies. Journal of the Experimental Analysis of Behavior, 31: 3-14.
Recommended
Ferster, C.B. and Skinner, B.F. (1957) Schedules of reinforcement. New York: Appleton-Century-Crofts.
Skinner, B.F. (1959) A case history in scientific method. In: S. Koch (ed.) Psychology: A study of a science. Volume 2. New York: McGraw-Hill, pp. 359-379.
Staddon, J.E.R. and Ettinger, R.H. (1989) Learning: An introduction to the principles of adaptive behavior. San Diego, CA: Harcourt Brace Jovanovich.