Understanding large and complex scientific data remains an immature topic. It involves studying visualization methods that faithfully represent data, on the one hand, and designing interfaces that genuinely assist users with data analysis, on the other. In an earlier study, we developed guidelines for choosing a display environment for four specific but common data analysis tasks: identification and judgment of the size, shape, density, and connectivity of objects in a volume. The results showed that users of the fish tank virtual reality (VR) system were significantly more accurate at judging the shape, density, and connectivity of objects, and significantly faster, than users of the immersive head-mounted display (HMD) VR system. Based on those results, we asked whether user performance could be further improved by adding tangible elements to the fish tank VR system. We propose several interface prototypes of a clipping plane, realized with the help of wireless vision-based tracking. These prototypes allow users to experience and evaluate different user interface strategies for performing the clipping plane function. An experimental study was carried out to quantitatively measure the added value of these tangible interfaces. The results show that adding a tangible frame for controlling a virtual clipping plane, together with the corresponding 2D intersection image, to the basic fish tank VR system significantly improves user performance on the shape, size, and connectivity tasks.