High-throughput screening (HTS) was developed over the past decades with the aim of selecting promising drug candidate leads from the huge compound libraries produced by combinatorial chemistry. From the outset, the technique consisted of detecting molecules capable of interacting with specifically selected target receptors, enzymes or antibodies, so as to identify those suitable for further development. In the early days, radioactive, enzymatic or fluorescent detection techniques were commonly applied in plate-reader formats. Throughput was progressively enhanced by increasing the number of wells per plate and by automating the procedures. A further gain in throughput came with the switch from heterogeneous to homogeneous technologies, in which the detection signal is generated directly by the biomolecular interaction, without the need to separate bound from free fractions within the reaction mixture. This advance eliminated handling steps, lowering costs and further improving screening throughput, culminating today in ultra-high-throughput methodologies. The increase in data generated by these technological improvements required a parallel development of software capable of handling and analysing the information, sometimes at the cost of false negatives.

The drug discovery process now faces a new bottleneck arising directly from HTS itself. Although HTS allows new drug candidates to be identified rapidly, it does not address whether the hits are applicable to a biological system. In particular, the technology gives no insight into the potential cellular toxicity of a hit, an assessment that is a minimal requirement for further development. The bottleneck is therefore shifting from hit identification to the evaluation of toxicity and bioavailability, a new challenge that will demand further inventiveness on the part of pharmaceutical scientists.