Zak Niazi [Image: Courtesy of Circle Optics]
The July/August 2023 issue of Optics & Photonics News featured the magazine’s biennial feature spotlighting 10 Entrepreneurs to Watch. Here, we offer an interview with one of those entrepreneurs, Zak Niazi, the founder and CEO of Circle Optics, USA. Niazi says that with Circle’s “stitchless” 360-degree multi-camera system, which solves the problem of parallax using abutting polygonal lenses, the company aims to both “democratize” immersive experiences and enhance safety and security.
Can you give me a few sentences about what Circle Optics is doing and where you want to take it?
We built the world’s first “stitch-free” 360-degree camera. Stitching is this problem of how to fuse together multiple images to create a panorama—it typically takes several hours to complete the process for a single minute of content. And so it’s really not a scalable process to create 360-degree video.
The company initially started just trying to make it easier and more accessible to collect 360-degree video in a scalable way, where all the user has to do is press a button, and they get the video output instantaneously. We’ve eliminated a problem that can cost people upwards of $3,000 per minute to produce.
Can you tell me about the core technology that enables Circle’s approach to this problem?
We solved the problem optically. Basically everyone else right now is taking commercial off-the-shelf components and stitching images together after the fact, which results in stitching artifacts and stitching errors. And we’ve invented lenses that are designed to be stitchless from the ground up.
Whereas a commercial off-the-shelf lens has a circular field of view that overlaps an adjacent camera’s circular field of view, we’ve designed polygonal lenses, where the lenses can abut. Where the lenses touch, the fields of view conjoin, so that the images can be snapped together seamlessly.
Another way to look at this is, we figured out a way to co-locate the centers of perspective, or the entrance pupils, of multiple lenses to create a combined panoramic image without any perspective errors. So in a nutshell, we're basically solving the problem optically, whereas everyone else is trying to solve the problem computationally and bandage the solution after the fact.
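As a rough illustration of why separated entrance pupils cause parallax, and why co-locating them removes it, the sketch below computes the angular disparity between two cameras viewing the same point. The geometry and numbers are illustrative assumptions, not Circle Optics' actual design parameters.

```python
import math

def parallax_angle_deg(baseline_m: float, distance_m: float) -> float:
    """Angular disparity (degrees) between two cameras whose entrance
    pupils are separated by `baseline_m`, viewing a point `distance_m`
    away on the perpendicular bisector of the baseline."""
    # Each camera sees the point offset laterally by half the baseline,
    # so the two viewing directions differ by twice that angle.
    return 2 * math.degrees(math.atan((baseline_m / 2) / distance_m))

# Two off-the-shelf cameras ~10 cm apart: the same point appears at
# noticeably different angles up close (~5.7 deg at 1 m), and the
# disparity shrinks, but never vanishes, with distance.
near = parallax_angle_deg(0.10, 1.0)
far = parallax_angle_deg(0.10, 100.0)

# Co-located entrance pupils (baseline -> 0): zero disparity at any range,
# which is why abutting lenses with a shared center of perspective can be
# joined without stitching corrections.
colocated = parallax_angle_deg(0.0, 1.0)
```

Because the disparity depends on object distance, a computational stitch tuned for one depth breaks at another, whereas driving the baseline to zero optically removes the error for all depths at once.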
One of Circle’s stated aims is to “democratize immersive experiences.” Can you talk about what that means to you and why you think it’s so important?
Right now, in order to have an experience like traveling around the world, you need to hop on an airplane. And a lot of people either can't afford to do that, or it's not really feasible because of physical limitations. And even if you could afford it or choose to do it, most people only see a handful of places in their lifetime. It takes a long time to go see places. Most people haven't experienced most of the wonders of the world because there's a lot of friction involved.
So really what we’re trying to do here is make experiences as accessible as information is today with the internet. You can access the world’s information at your fingertips with Google, ChatGPT—whatever question you have has been answered because the internet gives you access to all this information. But I think the next frontier is going to be democratizing experiences.
And we have the basic infrastructure already set up to enable this: virtual-reality technologies, Google Streetview, panoramic images that can be found online, web infrastructure to deliver high-resolution, high-quality data instantly.
But right now, it’s inaccessible to capture the content. If it costs you US$3,000 per minute to stitch the content together, it’s just not possible to map the streets of the planet and democratize experiences. So that’s really the inspiration that got us started—wanting to make anyone able to access anywhere in the world instantly, to see what it looks like in 360 degrees and be able to learn about a place remotely without ever having to be there.
Circle targets a variety of markets, including defense, drones and entertainment. How do you see your tech fitting into these different areas?
So we started out in the defense market, in aerospace, building 360-degree cameras for virtual-reality experiences. The US Air Force has a need for these cameras to create training content of places around the planet, to create 360-degree immersive experiences.
Early on, when we were starting the company, we realized it’d be really hard to scale the product just focusing on immersive entertainment because it’s hard to find capital to actually build the product. No one’s willing to pay to get the technology to completion. And we found that in defense, there’s a lot of financing through the Small Business Innovation Research (SBIR) and Small Business Technology Transfer (STTR) programs, where the government will actually pay you to get through that valley of death, which is the toughest part for a startup. So they pay you to develop these prototypes.
“There’s a lot of financing through the Small Business Innovation Research (SBIR) and Small Business Technology Transfer (STTR) programs, where the government will actually pay you to get through that valley of death.”—Zak Niazi
So that's how we've been financed to date. We've pulled in, I believe, north of US$8 million to date in federal contracts. And that's helped us to push the technology to completion. It's really the same type of markets that we're looking to get after in terms of immersive entertainment, media and entertainment—the US Department of Defense (DOD) has the exact same use case, they need to create 360-degree content, virtual-reality content. So they've been able to bootstrap the technology for those applications.
The other use case that we found is working with NASA for one of these SBIR contracts. I didn’t realize until working with Circle Optics that NASA is actually responsible for a lot of the airplane technology that makes our airways safer. So in addition to sending people to the moon, they’re actually doing a lot with technology in the Federal Aviation Administration (FAA) to make airplanes safer, and now make drones and flying cars safer.
So we got involved in some projects with NASA to develop 360-degree technology to make flying car technology safer. They realized that they needed a panoramic camera with both a wide field of view and the ability to see very far away. And all the cameras on the market that have a wide field of view don't see very far, and the cameras that see very far don't have a wide field of view. So we developed them a camera that, because of this stitchless aspect, actually has multiple benefits. Because you're solving the problem from a "physics-first" perspective, you're more efficient with the technology, and you can actually get better resolution for a wider field of view, letting you see further.
This physics-first solution has applicability in other industries as well, for instance in advancing autonomy. And we're actually miniaturizing the technology that we're developing for NASA for the drone community to help make drones safer and enable them to see much farther with a wider field of view for detecting and avoiding potential threats.
Can you talk a little bit about your different product offerings?
Hydra II is our flagship product, the 360-degree camera. So that’s the product for virtual-reality content, for creating 360-degree immersive content, and it eliminates the high cost.
The Hydra camera [Image: Courtesy of Circle Optics]
Pegasus is the drone product. So that's a product for small unmanned aircraft systems (UAS), the NASA product that we've miniaturized. And what distinguishes it from alternative technologies is a wider field of view, with the ability to see farther than the other products on the market. The best products on the market right now see about a mile out. We're doubling that, so we're going to see about two miles out. And we have some other products in the works that can see upwards of six miles out.
And Circle Optics offers engineering services as well?
Right now, we’re providing engineering services for the DOD through these SBIRs. It’s really a mechanism to help to further the technology—we find a customer who has a need for the drone product or a Hydra II camera, and they help to finance the nonrecurring engineering (NRE) to develop that product further. They’ll have some specifications, they’ll say, we want these bells and whistles attached.
But really, it's a very synergistic type of thing that helps us to advance the commercialization of the products. So that's the third offering we have; we're willing to do NRE, not just for the government, but for commercial companies. That's something we're looking to get into, broadening so that we're also working with commercial companies on the same types of contracts.
When did you first start to become interested in science, and specifically in optics and photonics?
In my freshman year of college, I was interested in studying philosophy and astrophysics. And I remember talking to someone, and he said the technology that’s used to understand astronomy is telescopes and optics. If you understand optics, you understand the technology that lets us peer into the far depths of the universe.
In terms of philosophy, the nature of light is one of the most philosophical questions there is. Is it a wave? Is it a particle? Getting to the fundamental quantum mechanics of wave-particle duality. And it intrigued me that this person said, if you want to study both philosophy and astronomy, why don’t you give optics a try? So I studied it.
And initially, I found it to be very, very challenging, in my freshman and sophomore years. And I think probably the fact that it was so challenging was why I stuck with it. I’m really drawn to tough things. And by the time I got to my junior and senior year, I started to realize that all the technologies you can imagine coming out in the future, you think of “Minority Report,” AR displays, holograms and all these different kinds of technology—they’re all going to rely on optics.
I think a lot of people are looking to computer technology and computer science, and there’s a lot to be had there for sure. But I think if you want to look to the future, it’s really going to be in the world of photons and atoms. And optics is going to play an integral part of the future technology. So I started to see that in senior year and wanted to get involved in business.
“I think if you want to look to the future, it’s really going to be in the world of photons and atoms. And optics is going to play an integral part of the future technology.”—Zak Niazi
Had you always been interested in starting a business?
That started around junior year, when I took some business school classes at the Wharton School at the University of Pennsylvania, USA. I got into a summer program there for technology, where they show engineers potential paths in business and what's possible. And so I took a few classes in entrepreneurship, and that got me thinking about all these different kinds of optics startups.
How did your founding team initially come together?
I bootstrapped the company on my own for a few years in New York City. I was actually contracting out the initial build of the camera to different optics contractors, and I went for a very long time without a team. But then in 2019, we got accepted into the Luminate accelerator, and we moved the company to Rochester, NY, USA. And when we moved to Rochester, I realized instantly that this was the right place to start a company.
In New York City, office space is super expensive, and there’s no talent for what we’re interested in. There’s lots of talent when it comes to business-oriented folks, financial folks, ad tech, fin tech. But when it comes to optics, there was very little talent to be had. And when we moved to Rochester, it was kind of like planting a seed into fertile soil. That was really the right hub to be starting an optics company.
It was much cheaper, and we also brought on board the initial seven or eight people who were core to the company within the first year. One of those people was from Luminate. He was a judge, and he liked what he saw so much that he joined the company. And then he brought a bunch of his colleagues along. He used to work at IMAX, he was a senior director of R&D over there. So we have sort of this mini IMAX hub.
There's been downsizing at a lot of companies in Rochester, like Kodak and IMAX, and they've let go of a lot of engineers. But it's great for startups like us, because that's amazing talent that needs a place to go. So we picked them up.
“We built a culture where people are free to be themselves, and we encourage people being themselves.”—Zak Niazi
And once you get the initial team going, people want to work with each other. They start to attract future engineers. We built a culture where people are free to be themselves, and we encourage people being themselves. So every meeting you go to, there’s a lot of joking, there’s a lot of laughing. People are having a good time because they are comfortable being authentic. And that’s the kind of culture we tried to create. I think that’s important for innovation, but also for people to enjoy a good work-life balance.
Can you talk me through the story of getting the company off the ground and funded?
The story started when I was taking a lens design class as a college student. I was asking the question, if Google has cameras on top of cars mapping the planet, why can’t you strap on a headset and roam around Venice, Italy, or the Great Pyramids of Egypt, and share that content?
Through that course, I talked to the director of engineering for Immersive Media, which was the company that sold Google their initial 360-degree cameras. He explained the problem of parallax. The reason you can’t get these immersive experiences is they’re too expensive to create because all cameras today suffer these perspective errors. And so I spent my senior lens design class trying to build a camera that didn’t have that issue. Ultimately, I failed. I couldn’t figure it out; I got a bad grade in the class. But I just kept thinking about the problem.
I worked in academia for a little bit, and then I decided to take off six months and try to solve the problem. I rigged together 10 Windows 97 PCs, and I was running Zemax simulations around the clock. When one simulation started to look promising, I would seed that solution to all 10 of the computers and run them in different directions with the optimization algorithm.
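That seed-and-diversify loop can be sketched in miniature: keep a single best design, hand it to several workers that each perturb it in a different random direction, and keep the round's winner. The merit function, step sizes, and worker count below are toy assumptions standing in for the Zemax runs, not the actual lens model.

```python
import random

def merit(x):
    # Toy stand-in for a lens-design merit function (lower is better);
    # the real search ran in Zemax against an actual optical model.
    return (x - 3.0) ** 2 + 0.1 * abs(x)

def local_search(x0, rng, steps=200, step=0.5):
    # One "PC": greedy random perturbation from the seeded start,
    # keeping only changes that improve the merit function.
    best = x0
    for _ in range(steps):
        cand = best + rng.gauss(0, step)
        if merit(cand) < merit(best):
            best = cand
    return best

def seeded_multistart(n_workers=10, rounds=5, seed=0):
    rng = random.Random(seed)
    best = rng.uniform(-10, 10)  # initial design guess
    for _ in range(rounds):
        # Seed the current best to every worker, each exploring a
        # different random direction, then keep the round's winner.
        results = [local_search(best, random.Random(rng.random()))
                   for _ in range(n_workers)]
        best = min(results + [best], key=merit)
    return best
```

In the real setup the workers would be separate machines running independent optimizer instances; the structure of the loop, seed the best candidate everywhere and diversify from it, is the same.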
And eventually I came to a solution that looked pretty promising within those six months, and I solved the core of the problem. I found some initial seed funding through family and friends to get off the ground and take that design to a third party and try to turn it into something more. And I was able to build a first prototype through that.
And then, once there was a prototype and an initial patent filed, that’s when we started to get some outside financing. We went to Luminate and got the financing there; we were one of the winners of Luminate later in the year, and we started to raise a lot more financing toward the end of that program.
And we've since taken investment from people like the inventor of Google Streetview. So the original vision kind of came full circle. And then we found government contracts, which were much more fruitful than private investment, especially for hardware.
Do you see the government as a key customer going forward?
It's a key customer in the short term. One of the things about government contracts is, a lot of people think they're going to want rights over the technology. They're going to want the technology just for their application. And it's actually the complete opposite. When you go after these contracts, a big part of what you're graded on is how commercializable the technology is. The government doesn't want to be the sole financier of technologies; they would much rather buy commercial off-the-shelf components.
For example, take the self-driving car community and lidar. The cost of lidar and a lot of these components has gone down over time because there's a lot of volume. And so the government is able to capitalize on purchasing components that have enabled autonomy, self-driving cars, and things like that because there's a lot of volume in these industries.
Whereas you take something like infrared cameras, there’s very little in the commercial segment. And so that’s almost exclusively funded through the DOD, and they pay through the roof to get these infrared cameras. So DOD, Air Force in particular, they judge you based on how commercially viable the solution is. They want to see a dual use. Even if they’re going to finance the technology in the short term, there’s got to be a larger overall market. So they’re fully on board with that, and the partners that we have are fully on board with getting that much larger market.
So they're really the initial investors, and they will be customers of the technology going forward. But there's really going to be a larger commercial market that we're tackling: virtual-reality content, virtual-reality cameras and the drone industry. And then DOD will benefit by getting lower unit costs because of those larger markets.
You've raised over $200,000 through WeFunder. Can you talk about your experience with crowdfunding and why you decided to pursue this funding outlet?
We had a lot of people who were involved with our story from the beginning and said they would love to be a part of it, if there was a mechanism. For a long time, we had to turn people away and couldn’t accept their checks because they weren’t accredited. We only took investments from accredited investors.
“We decided to open up a community round to give everyone who’s been with us for a long time the opportunity to invest.”—Zak Niazi
But we decided to open up a community round to give everyone who’s been with us for a long time the opportunity to invest. And we foresee probably doing future financing like that as we go forward and start to build more of a community. There’s a little bit of marketing to it as well, where you can build a community of people who are more invested in your product, your technology and your solution. And some of those people could eventually be early adopters and customers of the content. And so we see it as trying to start some sort of community as well.
Do you have lessons you’ve learned that you would share with someone looking to walk the same road?
I think one of the key things is, if you’re doing deep tech, it’s very different than if you’re starting a software or app company. If you’re starting an app company, the metrics that an investor will judge you by are completely different. Investors will be willing to take market risk. So most of the time, they’ll put investments in without you having customer validation.
But when it comes to a hardware company, they're willing to accept zero market risk, because the costs to develop it are so great, and the margins are so low on the other side. That's a really tough proposition, because oftentimes, to be able to validate your market, especially when you're a deep tech company, it takes a lot of capital to get up that curve. And that's something I didn't fully appreciate at the beginning: just how much capital it takes and how scrappy you've got to be to get that far.
Another thing is, I probably would have jumped into SBIRs and STTRs much sooner. I didn't realize that was a valid financing mechanism, and I stayed away from it for a long time. So that's what I would advise in particular to anyone starting a lens company, an optics company, a laser company—these are a lot more capital intensive, and there's not a lot of appetite from investors unless you can validate it. And to validate it, you have to build some fundamental components or fundamental technology, unless you're piecing together off-the-shelf components, which some companies are doing.
“If you’re designing something from scratch that’s truly deep tech, truly novel, you’ve got to figure out some way to finance that, and investors aren’t going to do it.”—Zak Niazi
But if you’re designing something from scratch that’s truly deep tech, truly novel, you’ve got to figure out some way to finance that, and investors aren’t going to do it. You’ll find some angel investors to go a little bit, US$25,000 or US$50,000 just based on the idea, but to really get capital, I would advise entrepreneurs to look into SBIRs and STTRs sooner.
And the other thing is how critical it is to find the right ecosystem. Being in New York City was not the right ecosystem for an optics company. Reid Hoffman likes to say that if you're going to start a company, start it in the right ecosystem for your startup. If it's a web app or software, that's most likely Silicon Valley. If it's fin tech or ad tech, that's New York City. But if it's optics, photonics, that's going to be Rochester; Arizona; Jena, Germany. You have to find the right ecosystem for your company.
Can you talk to us a little bit about where the company stands now, both in terms of funding and products?
The first two products are actually coming online this year. In July, we're going to have the Hydra II system release, and we're going to start to create content with it. We're going to be making the first 10 pieces of 360-degree content. Right now, it would cost potentially about half a million dollars to produce the type of content we're trying to create, which will be hour-long, immersive walkthroughs of different locations around New York State: Central Park, the High Line, a bunch of places like that. So that's the first thing that people will see.
And then the drone product is going to be flight tested later this year into early next year. We’re looking for flight test partners, drone manufacturing partners to integrate with. The initial results from that testing will be released in about the first quarter of 2024.