Early last month, ARK Multicasting announced the successful completion of its ATSC 3.0-based content delivery network trial.
Under the auspices of the Streaming Video Technology Alliance, the LPTV broadcast group, which aims to put 300 ATSC 3.0 stations on the air to support a nationwide data delivery network, worked with companies like Hewlett Packard Enterprise, Lumen Technologies, Broadpeak and SiliconDust to demonstrate the viability of the Open Caching standard in its 3.0-based datacasting model.
The company has already conducted conversations with a broad range of potential customers looking to leverage its 3.0-based datacasting service to do everything from delivering firmware updates to streaming OTT video content over the air.
In this interview, company co-founder and CEO Joshua Weiss discusses the progress ARK Multicasting is making in realizing its vision for ATSC 3.0-based datacasting, who’s interested in the service and when he expects the company to begin generating revenue.
(An edited transcript.)
TVTech: What is ARK Multicasting’s vision for ATSC 3.0?
Josh Weiss: ARK Multicasting is an operating company building an ATSC 3.0 business across a portfolio of low-power television stations. Our intent for what we’re building is to utilize the majority of the capacity of these low-power stations to do datacasting across the entire city where one of our stations is located.
The way we view both the spectrum and the opportunity is that we’re building a pipe that can be used as part of the greater network architecture for getting data from one to many in the most efficient, effective way that enhances communications architecture, improves networks, eliminates network congestion, benefits content owners, benefits internet service providers, benefits content delivery networks and ultimately enhances the end user’s experience.
TVT: When will we see 3.0 powering an ARK Multicasting offering?
JW: We have 10 stations we’ve been deploying with the full 3.0 build, complete with the broadcast core and all of the aspects and elements capable of doing 100% of what our big-picture vision is. In those builds, we are already doing tests and demonstrations to work toward commercialization. Ultimately, that’s the big question among broadcasters. When will that happen? How will it happen?
Before we take this to customers, we need to demonstrate in a real-world environment that it’s not some sort of lab experiment. It is something you can do today. When we start generating revenue is based on our sales cycle and the work we are doing.
TVT: When do you expect the revenue to begin from 3.0-based datacasting?
JW: It’s difficult to really predict that accurately. Honestly, if we’re going to be real, we have to recognize it’s a new market. There’s a lot of opportunity, but to try to speculate on when that date happens, it’s anybody’s guess. When we build out our financial models, we anticipate 18 to 24 months before we’re really bringing in revenues.
Our commercialization conversations, which we’re having on a regular basis, are already to the point of talking numbers and contracts. So, I believe that it is very possible that we’ll beat that date. But I want to be realistic—so we project conservatively 18 months.
TVT: I am a little surprised you are talking to possible customers before you have your entire 3.0-network built out.
JW: Contrary to the popular belief that you have to have ubiquity to have success in this space, the surprising thing we’re seeing is that our customers aren’t so much interested in requiring that ubiquitous coverage, because we’re not being considered as a replacement. We’re an accretive augmentation to existing networks.
So, anywhere somebody is able to do datacasting across an ARK Multicasting station is a benefit to that particular market and the particular customers they’re going to reach using our broadcast signal.
As a result, if we’re serving a Ford F-150, we don’t have to serve every Ford F-150 in the market. But every Ford F-150 that we do serve is a benefit to Ford and the end user.
TVT: If lack of ubiquitous coverage is not a deal breaker, what are the factors that are important to potential customers?
JW: They’re looking at markets when they’re doing their business analysis. Their business analysis is trying to answer a few questions: What does it look like in a market where there’s congestion from the point of view of land, terrain, buildings, mountains? What does it look like where there’s a density of population in correlation to what’s the connectivity in that market? And what’s the data consumption in that market?
So, you look at a rural city that might only cover 200,000 people. That’s equally as interesting to some of these customers as a market that covers 2 [million] or 3 million people. Because in that 200,000-person market, they might not have good connectivity. There might be a greater span geographically, but the data consumption is still something that they’re trying to answer. So, it’s not just about ubiquity. It’s actually very little about ubiquity, and more about what the accretive augmentation is in each individual market.
TVT: I just want to make sure we are on the same page when using the word “datacasting” as relates to 3.0. When you talk about datacasting, are you referring to video, audio and data streams, not simply data used for something like updating vehicle firmware?
JW: When we think of ATSC 3.0 and the broadcast nature of it, we’re simply viewing it as a way to get data out as Internet Protocol, IP, data packets. Those data packets may be audio files, may be video files [or] may be firmware updates. They may be cached files, or they may be live real-time files. It’s all data.
The reality is data consumption is skyrocketing because resolutions are getting higher, there’s a growing number of devices, there are smart homes and smart cities.
TVT: We recently reported on the Streaming Video Technology Alliance proof-of-concept testing of the over-the-air content delivery network (CDN) that ARK Multicasting helped to lead. What are the major takeaways?
JW: One of the big proposals of the Streaming Video Technology Alliance that we’re in lockstep with is an open caching standard.
Let me explain how this works. When somebody requests that Xbox game update, or that Yellowstone video file or that firmware update, the ISP [internet service provider] gets the request. The ISP then has to determine the closest CDN to pull the file from, and that CDN gets the benefit of the handshake, and the content owner pays for that experience.
What we found in our conversations with ISPs was they want what we’re offering, but they don’t necessarily have a CDN built into their network, and they don’t want to overstep ethical boundaries of peering into their network to identify what that high-demand content is.
The ISPs want us to bring the CDN with us, which is an important component because the CDN is the one that knows what the popular content is.
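The CDN-selection step Weiss describes, where the ISP pulls a requested file from the nearest cache that holds it, can be sketched roughly as below. This is a hypothetical simplification of open caching, not ARK’s actual implementation; all names and distances are invented for illustration:

```python
# Sketch of nearest-cache selection: with open caching, an in-home cache
# node can join the candidate list and win on proximity.
from dataclasses import dataclass

@dataclass
class CdnNode:
    name: str
    distance_ms: int   # illustrative network distance to the requesting user
    has_file: bool

def pick_cdn(nodes: list[CdnNode], filename: str) -> CdnNode:
    """Return the closest node that actually holds the requested file."""
    candidates = [n for n in nodes if n.has_file]
    return min(candidates, key=lambda n: n.distance_ms)

regional = CdnNode("Lumen edge PoP", distance_ms=20, has_file=True)
home_box = CdnNode("ARK box (in-home cache)", distance_ms=1, has_file=True)

# Without an in-home cache, the regional PoP wins; with one, the home box wins.
print(pick_cdn([regional], "xbox-update.bin").name)
print(pick_cdn([regional, home_box], "xbox-update.bin").name)
```

The point of the sketch is only the ordering: once a cache lives in the home, it is by definition the closest candidate, which is the "benefit of the handshake" the upstream CDN otherwise captures.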
We began working with the Streaming Video Technology Alliance, and Lumen Technologies, which is a CDN, of course, to build out along with Broadpeak what that would look like from an infrastructure and architecture standpoint within our network.
The CDN views this as an opportunity to either (a) be the CDN of choice when that ISP goes to request a file, because it’s the one living closest to the end user, i.e., in the home, or (b) upsell its content owner by giving them the choice to live in the home, which is a value-add for the content owner.
We know that for it to work for us, we have to provide the infrastructure from the broadcast station all the way to the end user. We partnered with Broadpeak to build that infrastructure into our network, which in our opinion has to be an open caching standard. [Editor’s note: The Streaming Video Technology Alliance has created the Open Caching specification.]
The Streaming Video Technology Alliance has championed this open caching, which basically says “We’ve got the downstream CDN from the station to the end user, but we’re open to any CDN on the upstream side.”
That means we can work with any content delivery network or the actual content owners if we desire. If it’s all open caching, it’s not proprietary to make it where it only works with one of them.
We worked with SiliconDust, a leader in the 3.0 set-top box space, and they developed the ARK box, essentially a home CDN box that has an Ethernet cable to go at the network level, a 2TB hard drive that serves as the cache, a power cable and an antenna cable. From that one point, the CDN lives in that box thanks to Broadpeak and speaks to the upstream CDN thanks to the Streaming Video Technology Alliance Open Caching Standard. Then a CDN like Lumen, as well as others, is able to be the upstream CDN. The CDN in the ARK box handles the cache or passes through the linear content in a CDN fashion.
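The ARK box behavior described above, serve from the local 2TB cache when a file has already arrived over the air, otherwise pass the request through to the upstream CDN, amounts to a standard cache/pass-through pattern. A minimal sketch, with invented file names and a stub standing in for the upstream CDN:

```python
# Sketch of an in-home cache node: hit the local store if possible,
# otherwise fall through to the upstream CDN and remember the result.
upstream_calls = []  # records which requests actually reached the upstream CDN

def fetch_upstream(name: str) -> bytes:
    upstream_calls.append(name)
    return b"<bytes from upstream CDN>"

def serve(request: str, cache: dict[str, bytes]) -> bytes:
    if request in cache:               # delivered over the air earlier: local hit
        return cache[request]
    data = fetch_upstream(request)     # cache miss: normal unicast fetch
    cache[request] = data              # keep it for subsequent requests
    return data

cache = {"mrs-maisel-s05e01.mp4": b"<cached episode bytes>"}
serve("mrs-maisel-s05e01.mp4", cache)   # hit: never touches the upstream CDN
serve("tnf-live-segment-001.ts", cache) # miss: goes upstream once, then cached
print(upstream_calls)
```

Only the miss appears in `upstream_calls`, which is the congestion relief Weiss describes: content broadcast once over the air never has to traverse the ISP’s unicast network again.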
Because of our built-in orchestration layer, the part of the broadcast core network built by Hewlett Packard Enterprise, any customer along different data verticals can log into our portal on the ARK website; choose any individual station, any region, any batch of stations or the entire nation; and send a file, or “Thursday Night Football” or whatever, via the stations they choose with just a few clicks of a button.
TVT: From what you’ve said, the ARK solution augments, not necessarily replaces, data delivery networks. Can you give me a scenario you envision to illustrate that?
JW: Think of it like this, when Amazon is putting out “The Marvelous Mrs. Maisel” or “Thursday Night Football,” either of those live or cached, they’re going to be putting it out in multiple different resolutions for different device applications.
They might not care to put all of those through the one-to-many multicast that ARK provides. Maybe they only want to do the most consumed resolution of HD or 4K, whatever that may be. They can choose which resolution they want and have that go through our pipe. The rest goes through the traditional unicast network architecture.
If it’s “The Marvelous Mrs. Maisel” and it’s living in the cache in the hard drive in the CDN in the ARK box, no problem, and if it’s a live thing like “Thursday Night Football,” same thing. No problem.
TVT: How do you plan to market the ARK box to subscribers?
JW: We’re not looking to have a direct relationship with the end user. We’re a B2B play. Our customers are the content owners, the cloud CDN providers, the ISPs or some combination of all three.
TVT: Is the incentive then what’s been called the broadcast offload of network traffic for one-to-many applications?
JW: When we talk to ISPs, we’re told that when an Xbox game update comes out, their network is 65% congested by that one file. There’s motivation for the ISP to deliver a new box [the ARK box] if that new box relieves congestion at that magnitude.
Beyond that, we believe that there’s a business opportunity for an ISP, to participate in some of the benefit that comes from the content owners, who typically only pay the CDN. The ISP is the one that’s left out while their network is being consumed by the same files redundantly traveling across it.
TVT: How do you see your future customers paying for your service?
JW: There are three ways. One, they are charged by a best-case broadcast. They can choose to put out the file they wish to distribute really fast and robust, or really slowly, trickled out. In any case, they’re paying for bandwidth. How much of our 6 MHz is being used? And how long is that taking?
That’s what they’re going to pay for. That’s best-case broadcast. We don’t check on the other end to verify that anyone received it. We don’t check to see if anyone consumed it. They’re responsible for the receive side.
The second approach, which would be a little bit more expensive, is anything verified on the receive side. So, once we’ve broadcast it, if they’re using our receive side, they can choose to pay based on what’s been validated to have been received by the receive device.
The third approach is for them to pay only for what is actually consumed out of that box. So, it’s low-risk. If they’re unsure, fine; only pay for what we actually transmitted that has been verified to have been received and verified to have been consumed in that box in a metered fashion.
And that way, you know that if you’re paying for it, it’s because it’s benefiting your overall business case.
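The three tiers Weiss outlines reduce to three simple billing formulas: bandwidth times duration, per-verified-receiver, and per-gigabyte-consumed. The sketch below is purely illustrative; ARK has published no actual pricing, and every rate here is invented:

```python
# Three pricing tiers, as described in the interview (rates are hypothetical).

def best_case_broadcast_cost(mbps_used: float, hours: float,
                             rate_per_mbps_hour: float) -> float:
    """Tier 1: pay for the share of the pipe used and for how long,
    with no receive-side verification."""
    return mbps_used * hours * rate_per_mbps_hour

def verified_receive_cost(receivers_verified: int,
                          rate_per_receiver: float) -> float:
    """Tier 2: pay per device that confirmed receipt of the file."""
    return receivers_verified * rate_per_receiver

def metered_consumption_cost(gb_consumed: float,
                             rate_per_gb: float) -> float:
    """Tier 3: pay only for what end users actually consumed from the box."""
    return gb_consumed * rate_per_gb

# Example: same campaign costed three ways (all figures invented).
print(best_case_broadcast_cost(10.0, 2.0, 0.5))   # 10 Mbps for 2 hours
print(verified_receive_cost(20, 0.5))             # 20 verified receivers
print(metered_consumption_cost(50.0, 0.25))       # 50 GB actually consumed
```

Each successive tier shifts risk from the customer to ARK, which is why Weiss describes the verified and metered tiers as progressively more expensive per unit.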
TVT: Where does ARK Multicasting go from here?
JW: We’re going to continue plowing forward. It’s an uphill battle because we’re blazing into new territories. It’s a field that hasn’t been tapped yet.
As a result, we’re working hard as pioneers, in a sense, which is very difficult. But we believe, with everything in us that there’s something very real here. The opportunity, once it’s become crystallized, once it comes more in focus regarding what the low-hanging fruit is for commercial partners, we believe that there’s a bright future.
As a result, we’re going to continue being all in on everything that we’ve been doing. That includes continuing to develop what we’ve been developing, continuing to build stations and continuing to look under every stone to gain market input and get smart about what potential customers need.
TVT: Is ARK Multicasting and its 300 future stations a closed model, or are you open to working with affiliate broadcasters to expand your network footprint further?
JW: We’ve had multiple conversations with multiple different groups that represent hundreds of stations that want ARK to be resellers of their bits. Those requests have come from multiple full-power broadcasters and low-power broadcasters.
They are signs to us that we’re on to something that could very well be a situation in which we sell the bits for all of them, and everybody benefits. However, I’m content to work with others if somebody else is further along and has a better solution. I’m not naïve enough to say, “It’s my way or the highway.” In my opinion, the faster and better we can get this commercialization off the ground, the more we’re all going to benefit.