ZapLab: a visual environment for associative information retrieval of learning assets and objects

W. Veen, S.C. Santema, G. Pasman, A. Toet

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › Academic › peer-review

Abstract

This paper addresses the need for new learning environments for next-generation learners. Homo Zappiens has developed information-processing skills that integrate visuals and sounds, movement and text. This paper describes the development of an application called ZapLab, an interactive environment for classifying and retrieving learning assets and objects according to individual learning needs. It supports learners and facilitates non-linear learning behaviour by visually presenting learning resources according to resource characteristics, their interrelationships, and search criteria chosen by the learner. ZapLab is a working prototype developed for use in higher education. Resources from an existing blended and flexible elective course were used to experiment with the design of the environment, which has been tested for user experience.
Original language: English
Title of host publication: Proceedings World Conference on Educational Multimedia, Hypermedia and Telecommunications (ED-MEDIA 2004), June 21-26, 2004, Lugano, Switzerland
Editors: L. Cantoni, C. McLoughlin
Place of Publication: Norfolk VA, USA
Publisher: Association for the Advancement of Computing in Education
Pages: 1668-1676
Publication status: Published - 2004
Event: Ed-Media 2004, Lugano, Switzerland
Duration: 21 Jun 2004 – 26 Jun 2004

Conference

Conference: Ed-Media 2004
Period: 21/06/04 – 26/06/04
Other: Ed-Media 2004
