Apple on Thursday announced via its Machine Learning Journal blog that it will be attending NeurIPS, the Conference on Neural Information Processing Systems, which runs next week from December 2 through December 8 in Montréal, Canada.

The company will have a booth at NeurIPS staffed with machine learning experts from across Apple. The post also advertises 137 open jobs across various machine learning fields, including artificial intelligence, computer vision, data science, deep learning, machine learning infrastructure, reinforcement learning, language technologies and more.

To apply for those positions, visit Apple’s Jobs website.

The Cupertino tech giant used to prohibit its machine learning and artificial intelligence scientists from publishing their findings or attending specialized conferences, but that has changed with the arrival of the Machine Learning Journal blog in July 2017.

For the past few years, Apple has been designing its hardware and silicon with machine learning in mind. The 2018 iPad Pros and the last two iPhone generations have benefited tremendously from Apple's dedicated machine learning hardware, dubbed the Neural Engine, which dramatically speeds up computationally heavy tasks such as image and object recognition, voice analysis, scene analysis and more.

The Neural Engine is embedded in Apple's A-series chips starting with last year's A11 Bionic. This year's A12 Bionic has even faster Neural Engine hardware capable of executing up to five trillion operations per second. This enables up to 10x faster image and scene analysis versus the A11 Bionic chip, making possible advanced computational photography features and imaging capabilities like real-time Depth Control and Smart HDR.

On iPhones without the Neural Engine, iOS falls back to the CPU and GPU for these tasks.
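For developers, this fallback behavior surfaces through Core ML's compute-unit configuration. As a minimal sketch (not from the article), `MLModelConfiguration` lets an app hint which processors a model may use; on devices without a Neural Engine, Core ML schedules the work on the CPU and GPU regardless. The model class name here is hypothetical:

```swift
import CoreML

// Configure which compute units Core ML may use for inference.
let config = MLModelConfiguration()

// .all permits the Neural Engine where available; on older chips
// Core ML silently falls back to the CPU and GPU.
config.computeUnits = .all

// Developers can also restrict execution explicitly, e.g. for
// reproducible benchmarking across devices:
// config.computeUnits = .cpuAndGPU

// Loading a hypothetical compiled Core ML model with this configuration:
// let model = try MyImageClassifier(configuration: config)
```

This is one plausible way an app opts into (or out of) Neural Engine acceleration; the fallback itself is handled transparently by the framework.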

Back in the summer, Apple hired John Giannandrea, Google's seasoned veteran and former chief of Search and AI, and merged its Core ML and Siri teams under his leadership.