Lung ultrasound imaging is receiving growing attention, since the analysis of specific artifactual patterns reveals important diagnostic information. A-line and B-line artifacts are particularly relevant: A-lines are generally considered a sign of a healthy lung, while B-lines correlate with a large variety of pathological conditions. B-lines have been found to indicate an increase in extravascular lung water, the presence of interstitial lung diseases, non-cardiogenic lung edema, interstitial pneumonia, and lung contusion. The ability to accurately and objectively detect and localize B-lines in a lung ultrasound video is therefore of great clinical interest. In this paper, we present a method that supports clinicians in the analysis of ultrasound videos by automatically detecting and localizing B-lines in real time. To this end, modern deep learning strategies were used and a fully convolutional neural network was trained to detect B-lines in B-mode images of dedicated ultrasound phantoms. Furthermore, neural attention maps were computed to visualize which components of the image triggered the network, thereby offering simultaneous weakly supervised localization. In vitro, on data from dedicated lung-mimicking phantoms, the method achieved an accuracy of 0.917, a sensitivity of 0.915, a specificity of 0.918, a negative predictive value of 0.950, and a positive predictive value of 0.864.
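One common way to obtain weakly supervised localization of the kind described above is through class activation maps, where the final convolutional feature maps of a network with global average pooling are combined using the classifier weights of the target class. The sketch below is a minimal, hypothetical illustration of that idea in NumPy; it is not the authors' implementation, and the array shapes and function name are assumptions.

```python
import numpy as np

def class_activation_map(features, weights):
    """Coarse localization map from a CNN classifier (illustrative sketch).

    features: (C, H, W) array of final convolutional feature maps.
    weights:  (C,) classifier weights for the target class (e.g. "B-line"),
              as learned on top of global average pooling.
    Returns an (H, W) map normalized to [0, 1], highlighting the regions
    that contributed positively to the class score.
    """
    # Weighted sum over the channel axis: sum_c w_c * F_c(h, w)
    cam = np.tensordot(weights, features, axes=([0], [0]))
    # Keep only positive evidence for the class
    cam = np.maximum(cam, 0.0)
    # Normalize for visualization (e.g. as an overlay on the B-mode image)
    if cam.max() > 0:
        cam = cam / cam.max()
    return cam

# Toy example: one channel responds strongly at a single location
features = np.zeros((2, 4, 4))
features[0, 1, 2] = 5.0
weights = np.array([1.0, 0.0])
cam = class_activation_map(features, weights)
```

In this toy case the map peaks at position (1, 2), mimicking how an attention map would highlight the image region containing a detected B-line.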