Knostic Brings Access Control to LLMs

Led by industry veterans Gadi Evron and Sounil Yu, the new company lets organizations adjust how much information LLMs provide based on the user's role and responsibilities.

Fahmida Y. Rashid, Managing Editor, Features

April 11, 2024

4 Min Read
[Image: Laptop with multiple AI application windows. Source: Deemerwha Studio via Shutterstock]

Security startup Knostic is the latest company to address the various challenges organizations face as they adopt generative artificial intelligence (AI) tools. This week Knostic emerged from stealth with $3.3 million in pre-seed funding to bring "need-to-know" access controls to large language models (LLMs).

Enterprises in their AI transformation journey are sprinkling AI capabilities throughout their workflows and processes to boost productivity, reduce costs, and increase efficiency, says Gadi Evron, co-founder and CEO of Knostic. Enterprises are adopting LLMs to build ChatGPT-like enterprise search systems based on their own data sources or by enabling capabilities that are bundled into the applications and platforms they are already using. Data privacy is one of the biggest barriers to AI adoption, Evron says, noting that AI without controls potentially exposes the organization to increased risk, primarily by exposing information to the wrong people.

"How can we curate personalized information and actually give you value — answer with what you need to know instead of just saying stuff?" Evron says.

Access Control for LLMs Is Necessary

With Knostic, employees can access what they need and receive answers that align with the information they require to perform their jobs.

For example, an organization can have a system that answers questions such as what features are expected in the next product release, the latest sales numbers and revenue figures, the bonus structure, due diligence results in a merger-and-acquisition scenario, or the status of an infrastructure project. But not everyone should get the same answer to every question. While the CFO and CTO need to know the quarterly sales revenue, the marketing intern probably does not, Evron notes.

Knostic's access control engine considers whether the answer is appropriate for the questioner's role; if it is not, it answers with, "I'm sorry, that is confidential information," Evron says. Or instead of just saying no, the system can respond that even though the answer is confidential, the marketing campaigns that the intern worked on boosted sales over the quarter. That's where personalization and curation come in.
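Evron's description implies a policy layer that sits between the question and the model's raw answer. A minimal sketch of what such a need-to-know filter could look like is below; the role names, topic keys, and redirect messages are hypothetical illustrations, not Knostic's actual implementation:

```python
# Hypothetical need-to-know filter in front of an LLM answer.
# All roles, topics, and messages here are illustrative assumptions.

NEED_TO_KNOW = {
    # topic -> set of roles allowed to see the full answer
    "quarterly_revenue": {"cfo", "cto"},
}

REDIRECTS = {
    # Instead of a flat "no", offer related, permitted context.
    "quarterly_revenue": (
        "That figure is confidential for your role, but the campaigns "
        "you worked on contributed to this quarter's sales growth."
    ),
}

def answer(topic: str, role: str, raw_answer: str) -> str:
    allowed = NEED_TO_KNOW.get(topic)
    if allowed is None or role in allowed:
        return raw_answer  # questioner has a need to know
    # No need to know: redirect with permitted context when available.
    return REDIRECTS.get(topic, "I'm sorry, that is confidential information.")

print(answer("quarterly_revenue", "cfo", "$12.4M"))
print(answer("quarterly_revenue", "marketing_intern", "$12.4M"))
```

The key design point from the article is the fallback: a denied request returns useful adjacent information rather than a bare refusal.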

Evron made it a point to emphasize that traditional access control is binary — either the person has access or they do not. Knostic's focus on "need to know" makes it possible to provide some information even when the answer is no.

"When we say no, we are not enabling the business," Evron says, noting that providing information in a different format or with related context helps business users more than just being told no. "Once you figure out what you are allowed to know, you can solve DLP [data loss prevention] and IAM [identity and access management]."

What 'Need to Know' Looks Like

When thinking about access control, organizations need to consider factors such as whether the system is internal or public-facing, whether the data used to generate responses is sensitive, and the role of the person asking the questions, says Sounil Yu, Knostic's co-founder and CTO.

There has been much discussion about how organizations need to build guardrails into AI systems to prevent abuse and avoid answers that could cause harm. However, this approach tends to be one-size-fits-all and doesn't account for a person's specific circumstances, Yu says. Consider how many publicly available chatbots refuse to provide medical information because they are not medical professionals and should not be used for diagnostics. But if a physician is trying to access that information as part of an investigation, that particular restriction is not helpful. Access control, unlike guardrails, takes into account factors such as time, the sensitivity of the data, and the person's role to determine how to shape the answers.

For example, a company may have a customer service chatbot that troubleshoots and assists in fixing common issues. That chatbot will have access to the same internal knowledge base articles that a customer service representative would have. But what happens if there is information about a product that is not yet available on the market? The customer service representative needs that information to be ready when the product launches, and even beforehand for training purposes. But there could be a lot of problems for the company if a customer learns details about the product from the chatbot before launch.

Instead of creating two systems — one for internal use and the other public-facing — the company can conceivably use Knostic's approach to provide different answers to the customer and to the customer service representative.
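The single-system approach the article describes amounts to filtering one shared knowledge base by audience. A rough sketch of the idea follows; the field names and the `released` flag are assumptions for illustration, not Knostic's data model:

```python
# Hypothetical sketch: one knowledge base serving two audiences.
# Article fields and the "released" flag are illustrative assumptions.

ARTICLES = [
    {"id": 1, "title": "Reset your router", "released": True},
    {"id": 2, "title": "Setup guide: unannounced product", "released": False},
]

def visible_articles(audience: str) -> list:
    """Internal reps see everything, including pre-launch docs for
    training; external customers see only released products."""
    if audience == "internal":
        return ARTICLES
    return [a for a in ARTICLES if a["released"]]

print([a["id"] for a in visible_articles("internal")])  # internal sees both
print([a["id"] for a in visible_articles("customer")])  # customer sees released only
```

The same query against the same corpus yields different answer sets depending on who is asking, which is what lets one system replace separate internal and public-facing deployments.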

Company Details

Evron and Yu have deep industry expertise. Evron founded cyber deception company Cymmetria and previously held roles at Citibank and PwC. Yu is the former chief security scientist at Bank of America and former CISO and head of research at JupiterOne.

Knostic, founded in 2023, has raised $3.3 million in pre-seed financing from Shield Capital, Pitango First, DNS Ventures, Seedcamp, and several angel investors. Retired Admiral Mike Rogers, former head of the National Security Agency and a member of Knostic's advisory board, said in a statement that the startup will "unlock LLMs for enterprises."

Knostic has customers across a range of industries, including retail and financial services. It is also one of the top three finalists for the 2024 RSA Conference Launch Pad, where founders of companies incorporated for two years or less get to pitch ideas and products "on the cusp of being the next big thing" to a panel of venture capitalists. This year's Launch Pad will be on Tuesday, May 7.

About the Author

Fahmida Y. Rashid

Managing Editor, Features, Dark Reading

As Dark Reading’s managing editor for features, Fahmida Y Rashid focuses on stories that provide security professionals with the information they need to do their jobs. She has spent over a decade analyzing news events and demystifying security technology for IT professionals and business managers. Prior to specializing in information security, Fahmida wrote about enterprise IT, especially networking, open source, and core internet infrastructure. Before becoming a journalist, she spent over 10 years as an IT professional -- and has experience as a network administrator, software developer, management consultant, and product manager. Her work has appeared in various business and test trade publications, including VentureBeat, CSO Online, InfoWorld, eWEEK, CRN, PC Magazine, and Tom’s Guide.
