This paper develops a framework for modeling dynamic choice based on a theory of reinforcement learning and adaptation. According to this theory, individuals develop and continuously adapt choice rules while interacting with their environment. The proposed framework specifies the required components of learning systems, including a reward function, incremental action value functions, and action selection methods. Furthermore, the system incorporates an incremental induction method that identifies relevant states based on the distributions of rewards received in the past. The system assumes multi-stage decision making in potentially very large condition spaces and can handle stochastic, non-stationary, and discontinuous reward functions. A hypothetical case is considered that combines route, destination, and mode choice for an activity under time-varying conditions of the activity schedule and road congestion probabilities. The results indicate that the system is robust across parameter settings and has good face validity. We therefore argue that it provides a useful and comprehensive framework for modeling learning and adaptation in the area of activity-travel choice.
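
The interplay of reward functions, incremental action value functions, and action selection methods described above can be illustrated with a minimal sketch. The code below shows a generic epsilon-greedy learner with incremental value updates applied to a stochastic mode-choice setting; all class names, parameters, and reward values are illustrative assumptions, not the paper's actual specification.

```python
import random

# Minimal sketch of an incremental action-value learner with
# epsilon-greedy action selection. Names and numbers are
# illustrative, not taken from the paper.
class IncrementalLearner:
    def __init__(self, actions, alpha=0.1, epsilon=0.1, seed=0):
        self.actions = list(actions)
        self.alpha = alpha      # step size for incremental value updates
        self.epsilon = epsilon  # exploration probability
        self.q = {a: 0.0 for a in self.actions}  # action value estimates
        self.rng = random.Random(seed)

    def select_action(self):
        # Epsilon-greedy: explore with probability epsilon, else exploit.
        if self.rng.random() < self.epsilon:
            return self.rng.choice(self.actions)
        return max(self.actions, key=lambda a: self.q[a])

    def update(self, action, reward):
        # Incremental update: move the estimate toward the observed reward.
        self.q[action] += self.alpha * (reward - self.q[action])

# Hypothetical example: two travel modes with stochastic rewards
# (negative expected travel times plus noise).
learner = IncrementalLearner(["car", "transit"], seed=42)
true_mean = {"car": -30.0, "transit": -20.0}
for _ in range(2000):
    a = learner.select_action()
    r = true_mean[a] + learner.rng.gauss(0, 5)
    learner.update(a, r)
best = max(learner.q, key=learner.q.get)
```

After repeated interaction, the learner's value estimates approach the expected rewards of each mode, so the action with the higher expected reward ("transit" in this stylized setup) dominates selection. The same update rule accommodates non-stationary rewards because the constant step size discounts older observations.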