With the new Docker-based installation you can get a basic, private Datalore instance running in less than 10 minutes. Whether on AWS, GCP, Azure, or an on-premises machine, you can set up Datalore by running a single Docker command. You can then connect authentication modules, set up internal usage plans, and customize environments step by step. See a detailed comparison of the Docker-based and Kubernetes-based installations here.
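For a typical single-machine setup, the installation boils down to one `docker run` invocation along these lines. The image name, tag, and port mapping below are illustrative assumptions, not the documented values; check the Datalore installation docs for the exact command:

```shell
# Pull and start a private Datalore instance, serving the web UI on port 8080.
# Image name, tag, and port are assumptions for illustration only.
docker run -d \
  --name datalore \
  -p 8080:8080 \
  jetbrains/datalore:latest
```

Once the container is up, the web UI should be reachable at the host's address on the mapped port.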
Use scheduling to run your notebooks on an hourly, daily, weekly, or monthly basis and deliver regular updates to published reports. Choose the schedule parameters in the user interface or provide a CRON expression. Notebook collaborators can be notified by email about successful or failed runs.
Get one-click access to all the necessary computation options from the left-hand sidebar. Configure scheduling options, machine types and kernel settings from one place.
Get a fully collaborative experience by editing Python scripts and other files attached to the notebook together with your team members. You'll be able to see your collaborators' cursors in the editor opened from the right-hand sidebar and get real-time updates to the files' contents.
Choose specific database schemas and tables for introspection when creating a database connection in Datalore. This speeds up the initial introspection and makes schema navigation easier.
You can now connect to an MS SQL Server database right from the editor interface – browse the schema and get code completion for SQL queries inside SQL cells. Read more about SQL support here.
You can now use variables (strings, numbers, booleans, and lists) defined in Python code inside SQL cells. This lets you build interactive reports with parameterized queries, reduces the amount of SQL you need to write, and provides a better UI for report users.