My past work has brought me to the convergence of AI, real-time sensing, connectivity, and automation, a convergence that, combined with ORNL's big team science opportunities, can enable autonomous discovery across the sciences. Autonomous discovery relies on self-driving labs and AI to select the next experiments, collect data, and analyze the results with limited human intervention, with the ultimate goal of benefiting research and augmenting the workforce. I am currently developing soil incubations with real-time measurements and analysis, a robotic system for imaging Petri dishes, and a real-time dashboard built on the ORNL INTERSECT microservices framework. An AI collaborator can interpret the visualized plots, offer suggestions, and discuss results with the user through a chat interface. While self-driving labs for pharmaceuticals and microbial synthetic biology are now widespread, automating plant, microbe, and soil research requires a step toward more general-purpose robotics rather than liquid and plate handling alone. The biological applications are discussed below, but what drives me is a single goal: the experiment you can talk to.
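As a minimal sketch of what "the experiment you can talk to" could look like, the snippet below pairs each user question with a summary of the latest incubation readings before handing it to a chat model. All names here (Reading, summarize_readings, ask_llm) are hypothetical placeholders chosen for illustration; this is not the INTERSECT API or the dashboard's actual implementation, and the model call is stubbed out.

```python
"""Minimal sketch of an 'experiment you can talk to' chat loop.

Hypothetical illustration only: the names below (Reading, summarize_readings,
ask_llm) are placeholders, not the INTERSECT API or any production code.
"""

from dataclasses import dataclass
from typing import Callable


@dataclass
class Reading:
    """One time point from a hypothetical real-time soil incubation."""
    hours: float
    co2_ppm: float
    soil_moisture_pct: float


def summarize_readings(readings: list[Reading]) -> str:
    """Condense recent sensor data into a text summary the model can read."""
    latest = readings[-1]
    return (
        f"{len(readings)} readings; latest at t={latest.hours:.1f} h: "
        f"CO2={latest.co2_ppm:.0f} ppm, moisture={latest.soil_moisture_pct:.1f}%"
    )


def chat_about_experiment(
    readings: list[Reading],
    ask_llm: Callable[[str], str],
) -> None:
    """Simple chat loop: each question is sent alongside the current data summary.

    `ask_llm` stands in for whatever chat model the dashboard connects to.
    """
    context = summarize_readings(readings)
    while True:
        question = input("you> ").strip()
        if question.lower() in {"quit", "exit"}:
            break
        prompt = f"Experiment status: {context}\nQuestion: {question}"
        print("ai>", ask_llm(prompt))


if __name__ == "__main__":
    demo = [Reading(0.0, 410, 22.0), Reading(6.0, 480, 21.4), Reading(12.0, 530, 20.9)]
    # Stub model that echoes its input; swap in a real chat-model call here.
    chat_about_experiment(demo, ask_llm=lambda p: f"(stub) I received: {p}")
```

In a deployed version, the summary string would be replaced by live data streamed from the instruments, and the stubbed model call would be routed through the chat interface described above.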