d-Matrix · 1 week ago
Compiler Software Engineer Intern
Artificial Intelligence (AI) · Cloud Infrastructure
Responsibilities
The Compiler Team at d-Matrix develops the software that performs the logical-to-physical mapping of a graph expressed in an IR dialect (such as Tensor Operator Set Architecture (TOSA), MHLO, or Linalg) onto the physical architecture of the distributed parallel-memory accelerator that executes it.
The compiler performs multiple passes over the IR to apply transformations such as tiling, compute resource allocation, memory buffer allocation, scheduling, and code generation.
You will be joining a team of exceptional people enthusiastic about developing state-of-the-art ML compiler technology.
In this role, you will design, implement, and evaluate a method for managing floating-point data types in the compiler.
You will work under the guidance of two members of the compiler backend team, one of whom is an experienced compiler developer based on the West Coast of the US.
You will engage and collaborate with the engineering team in the US to understand the mechanisms the hardware design provides for performing efficient floating-point operations with reduced-precision floating-point data types.
Successful completion of the project will be demonstrated by a simple model, produced by the compiler with your code incorporated, that executes correctly on the hardware instruction set architecture (ISA) simulator.
This model incorporates various reduced-precision floating-point number formats (a minimal conversion sketch follows below).
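The actual hardware mechanisms and compiler internals are d-Matrix-specific and not described in this posting, but the self-contained C++ sketch below illustrates the general kind of reduced-precision handling involved: narrowing IEEE-754 float32 values to bfloat16 with round-to-nearest-even and widening them back. The function names and the choice of bfloat16 are illustrative assumptions, not taken from the posting.

// A minimal sketch (not d-Matrix code): converting between IEEE-754 float32 and
// bfloat16, one reduced-precision floating-point format a compiler might insert
// casts for. bfloat16 keeps float32's sign and 8-bit exponent and truncates the
// mantissa to 7 bits, so conversion is a bit-level operation on the upper 16 bits.
// NaN/Inf payload handling is omitted for brevity.
#include <cstdint>
#include <cstdio>
#include <cstring>

// Narrow float32 to a bfloat16 bit pattern, rounding to nearest, ties to even.
uint16_t float_to_bfloat16(float f) {
    uint32_t bits;
    std::memcpy(&bits, &f, sizeof(bits));                // type-pun without UB
    uint32_t rounding_bias = 0x7FFFu + ((bits >> 16) & 1u);
    return static_cast<uint16_t>((bits + rounding_bias) >> 16);
}

// Widening bfloat16 back to float32 is exact: restore the low 16 bits as zeros.
float bfloat16_to_float(uint16_t b) {
    uint32_t bits = static_cast<uint32_t>(b) << 16;
    float f;
    std::memcpy(&f, &bits, sizeof(f));
    return f;
}

int main() {
    float x = 3.14159265f;
    uint16_t bf = float_to_bfloat16(x);
    std::printf("fp32 %.8f -> bf16 0x%04X -> fp32 %.8f\n",
                x, static_cast<unsigned>(bf), bfloat16_to_float(bf));
    return 0;
}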
Qualifications
Required
Bachelor’s degree in computer science, or equivalently 3 years toward an engineering degree with an emphasis on computing and mathematics coursework.
Proficiency with C++ object-oriented programming is essential.
Understanding of fixed-point and floating-point number representations, floating-point arithmetic, reduced-precision floating-point representations, sparse matrix storage representations, and the methods used to convert between them.
Some experience in applied computer programming (e.g. prior internship).
Understanding of basic compiler concepts and methods used in creating compilers (ideally via a compiler course).
Data structures and algorithms for manipulating directed acyclic graphs (see the sketch after this list).
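As a generic illustration of the DAG manipulation mentioned above (not tied to any d-Matrix IR), the C++ sketch below topologically orders an operator graph with Kahn's algorithm, the kind of traversal a scheduling pass relies on. The graph representation and function name are hypothetical.

// Minimal, generic sketch: Kahn's algorithm for topologically ordering a DAG,
// the kind of traversal used when scheduling operators in an IR graph.
// Integer node IDs and the edge list are illustrative; real IRs carry richer nodes.
#include <cstdio>
#include <queue>
#include <utility>
#include <vector>

std::vector<int> topo_sort(int num_nodes, const std::vector<std::pair<int, int>>& edges) {
    std::vector<std::vector<int>> succ(num_nodes);
    std::vector<int> in_degree(num_nodes, 0);
    for (auto [from, to] : edges) {
        succ[from].push_back(to);
        ++in_degree[to];
    }
    std::queue<int> ready;                          // nodes whose predecessors are all placed
    for (int n = 0; n < num_nodes; ++n)
        if (in_degree[n] == 0) ready.push(n);
    std::vector<int> order;
    while (!ready.empty()) {
        int n = ready.front();
        ready.pop();
        order.push_back(n);
        for (int s : succ[n])
            if (--in_degree[s] == 0) ready.push(s);
    }
    return order;                                   // shorter than num_nodes if a cycle exists
}

int main() {
    // A tiny example graph: 0 -> 1 -> 3 and 0 -> 2 -> 3
    std::vector<std::pair<int, int>> edges = {{0, 1}, {0, 2}, {1, 3}, {2, 3}};
    for (int n : topo_sort(4, edges)) std::printf("%d ", n);
    std::printf("\n");
    return 0;
}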
Preferred
Familiarity with sparse matrix storage representations.
Hands-on experience with CNN, RNN, and Transformer neural network architectures.
Experience programming GPUs and specialized hardware accelerator systems for deep neural networks.
Passion for learning new compiler development methodologies such as MLIR.
Enthusiasm for learning new concepts from compiler experts in the US and willingness to work across time zones to facilitate collaboration.
Company
d-Matrix
d-Matrix builds a computing platform that targets artificial intelligence inference workloads in the datacenter.
H1B Sponsorship
d-Matrix has a track record of offering H1B sponsorships. Please note that this does not guarantee sponsorship for this specific role. The figures below are provided for reference. (Data powered by the US Department of Labor)
Total sponsorships: 2023 (4), 2022 (9)
Funding
Current Stage: Growth Stage
Total Funding: $154M
Key Investors: Temasek Holdings, TSVC
2023-09-06: Series B · $110M
2022-04-20: Series A · $44M
2021-01-13: Seed
Recent News
MarketScreener · 2024-11-20
Company data provided by crunchbase