Ontological engineers, often referred to as “ontologists” or “OEs”, teach our AI about the world. The job involves using conceptual analysis, research, programming, and logic to create principled and practical applications. For a given domain of knowledge, ontologists must identify the key concepts and objects, as well as the relationships between them. Then they represent these in a structured language—at Cycorp, that is CycL—which uses a form of higher-order logic.

Here’s a quick example, avoiding technical details. Suppose that we want to teach Cyc a bit about how things can be created. To do this, an OE first represents the relationship in Cyc; let’s call it createdBy. Then we have to specify what sorts of things can stand in the createdBy relation: a particular agent creates a particular thing. But we can be more specific, as there are many ways to create things. Individual animals create their offspring, carpenters create boats, George Lucas created Star Wars, and Apple created the iPhone. This suggests that we may have a whole cluster of similar relations, like offspringOf, builtBy, authorOfConceptualWork, and inventingCompany. In addition to developing the language to express how creating works, we can deduce a lot of knowledge from any given creation relationship:

Creators (partly) cause their creations to come into existence.
Temporally, creators exist before their creations come into existence, and creators cannot cease to exist until after their creations come into existence.
Creators bear (partial) responsibility for the things they create: they may profit from or sustain costs from the consequences of their creation.
Creators of physical things must come into physical contact with their created things.
And so on. Hopefully you can see that understanding a basic concept like creation involves seeing many connections to other concepts. In everyday speech, when someone says, “I built a boat,” they expect you to know that the boat didn’t exist before they built it and that they might deserve some praise for doing such a complicated thing. They also expect you to know a lot about boats: it would be silly for you to ask if the boat was in their pocket, or if they made it out of concrete. Instead, you should ask if it’s a sailboat, or if they plan to go fishing in it, or what sorts of wood or metals went into the construction. Ontologists develop and connect the knowledge base, the foundation of Cyc’s knowledge, in ways that help the AI understand the world, rather than just hard-coding answers to bespoke, one-off questions.
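To make the createdBy example a little more concrete, here is a rough, simplified sketch of what such assertions could look like in a CycL-style notation. The constant names below are illustrative stand-ins, not actual Cyc vocabulary, and real CycL assertions differ in detail:

```lisp
;; Declare createdBy as a binary predicate relating a creation to its creator.
(#$isa #$createdBy #$BinaryPredicate)
(#$arg1Isa #$createdBy #$SomethingExisting)   ;; the thing created
(#$arg2Isa #$createdBy #$Agent-Generic)       ;; the agent that creates it

;; More specific creation relations specialize createdBy, so anything Cyc
;; knows about creation in general applies to offspring, boats, and films.
(#$genlPreds #$offspringOf #$createdBy)
(#$genlPreds #$builtBy #$createdBy)

;; A sample fact: Star Wars was created by George Lucas.
(#$createdBy #$StarWars-TheMovieSeries #$GeorgeLucas)
```

Because offspringOf and builtBy specialize createdBy, a single general rule (say, that creators exist before their creations) automatically applies to every one of the more specific relations.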

Ontologists have a good deal of freedom in terms of which projects they work on. Since Cycorp focuses its efforts in health, energy, and technology, you will likely spend time learning about and teaching Cyc about those topics. Additionally, OEs contribute to internal projects aimed at broadly increasing Cyc’s knowledge, which can be almost anything: cultural celebrations (like Thanksgiving dinners, Diwali festivals, etc.), theories of causation, chemistry, interpreting the historical significance of events, or whatever knowledge you are specially equipped to teach Cyc about.

OEs at Cycorp are highly educated, trained, and valued employees who divide the world into understandable pieces for our AI. Without a good ontology, AI will not have the knowledge to be intelligent.

Job Requirements
Successful ontologist candidates must…

…be fluent in formal logic: Ontologists need to convert English into CycL. As a very basic example: if Mxy stands for “x is y’s mother” (and the domain of quantification is everyone), could you use formulas of first-order logic (with identity) to express: 1) everyone has a mother, 2) no one has more than one mother, 3) some mother has at least two children, and 4) the mother relation is asymmetric, irreflexive, and transitive? Are those four statements actually true? Do analogous statements hold for other relationships (e.g. “sibling of”, “grandparent of”)? If you can breeze through these questions at a brisk, conversational pace, that’s a good sign that you have the sort of logical fluency we are looking for.
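For reference, one standard way to render the first two of those statements in first-order logic (keeping the convention that Mxy means “x is y’s mother”) is:

```latex
% Everyone has a mother. Note the argument order: Myx, not Mxy,
% since the first argument of M is the mother.
\forall x\,\exists y\; Myx

% No one has more than one mother: any two mothers of x are identical.
\forall x\,\forall y\,\forall z\,\bigl((Myx \wedge Mzx) \rightarrow y = z\bigr)
```

Getting the quantifier order and argument order right in cases like these is exactly the kind of precision the job demands.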

…balance theoretical and practical virtues: On the one hand, ontologists must be principled in pursuing the goal of giving Cyc a complete and accurate representation of the world. On the other hand, successful ontologists balance this against the fact that clients find our knowledge useful only insofar as it quickly and efficiently delivers insights that benefit them.

…thrive in small teams and independently: Cycorp is an agile company. We work on many projects simultaneously, typically in teams of three to ten people. Employees may work on more than one project at a time. You must be a good communicator, able to shift gears quickly, and self-motivated.

Job Posting
Job Purpose
For the last 35 years, our 50-person team has been building Cyc, the world’s only AI that deeply understands what it’s reasoning about (as opposed to what everyone else today calls “AI”: statistical machine learning and limited-logic knowledge-graph searching). Human-level artificial general intelligence only needs to be created once, and we’re looking for people to join our team to help us in that effort. As an Ontological Engineer, you will find and fill gaps in Cyc’s knowledge about common sense, context, and how the world works. You must be exceptionally good at introspecting and articulating what you know and why you believe it. You must also be able to express those things formally as sentences in predicate calculus, the representation language that our AI understands. You should enjoy the challenge of thinking about multiple reasoning paths by which an answer could have been reached and engaging in pro-con reasoning.

What it’s like to work as an OE team member
In addition to codifying domain-independent knowledge, many of our OEs work on one or two particular Cyc applications and must grasp new concepts quickly in the areas of healthcare, energy, and technology.
Most of the work happens in teams, so OEs need to be able to communicate effectively and work collaboratively with their colleagues.
For some applications, speed matters, and in those cases OEs will tinker and experiment with redundant representations and reasoning heuristics that enable Cyc to find the same answers much more efficiently.

Part of the OE’s responsibility is to create regression tests to notice when Cyc stops getting some things right (e.g., due to dissonant new things it’s been taught).
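As an illustration of that responsibility (and not Cycorp’s actual tooling), a knowledge-base regression test might look like the following minimal Python sketch, where the `ask` function, the dict-backed KB, and the query strings are all hypothetical stand-ins:

```python
# Minimal sketch of a KB regression test. The tiny dict-backed `ask`
# function is a hypothetical stand-in for querying a real knowledge base.

KB = {
    "(createdBy StarWars GeorgeLucas)": True,
    "(isa GeorgeLucas Person)": True,
}

def ask(query: str) -> bool:
    """Return whether the KB currently entails the query."""
    return KB.get(query, False)

# Each case pins down an answer the system is known to get right today,
# so that dissonant new assertions that break it are caught immediately.
REGRESSION_CASES = [
    ("(createdBy StarWars GeorgeLucas)", True),
    ("(isa GeorgeLucas Person)", True),
    ("(createdBy StarWars Apple)", False),
]

def run_regression() -> list:
    """Return the cases whose current answer differs from the pinned one."""
    return [q for q, expected in REGRESSION_CASES if ask(q) != expected]

failures = run_regression()
assert failures == [], f"Regressions detected: {failures}"
```

When a newly taught assertion causes a previously correct answer to change, the failing case points the OE straight at the conflict.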
There is no one right way to represent things: the job involves judgment and taste, not just objective analysis; OEs are as much artists as they are engineers.
Required Qualifications
You must be fluent in translating in both directions between English sentences and logically equivalent sentences in predicate calculus. No particular degree or training is required; some of our OEs never graduated high school, but most hold a Ph.D., many in Philosophy with competencies in symbolic logic.
You must be self-motivated. Cycorp emphasizes having great and creative employees over having a rigid structured environment.
As mentioned above, your job as an OE is, roughly, to teach Cyc how to think. So you must be able to hypothesize and articulate plausible reasoning paths. You might do this by introspecting on how you would reason, or by interrogating subject matter experts to gain insight into the way they think about things.
Preferred Qualifications
We prefer, but do not require, candidates with interest or previous experience in artificial intelligence. That said, logical and reasoning skills are vastly more important than AI-specific training.
Ontologists may benefit from experience and training in computer science and/or programming, especially in LISP.
Ideal candidates will have domain knowledge in at least one of our primary verticals: healthcare, energy, or technology.
Cycorp has a dedicated effort towards natural language understanding, and as such is interested in candidates with a background in linguistics and/or natural language representation.
Client-facing experience (sales and/or technical support) facilitates better product development and subject matter expert interactions.
Technical project management experience, especially in an agile environment, is a plus.
Cycorp’s AI R&D is headquartered in Austin, Texas, and applications are understood to be for regular, full time employment on-site here. Cycorp is an equal opportunity employer. We conform to all the laws, statutes, and regulations concerning equal employment opportunities and affirmative action. We strongly encourage women, minorities, individuals with disabilities and veterans to apply to all of our job openings. We are an equal opportunity employer and all qualified applicants will receive consideration for employment without regard to race, color, religion, gender, sexual orientation, gender identity, or national origin, age, disability status, Genetic Information & Testing, Family & Medical Leave, protected veteran status, or any other characteristic protected by law. We prohibit retaliation against individuals who bring forth any complaint, orally or in writing, to the employer or the government, or against any individuals who assist or participate in the investigation of any complaint or otherwise oppose discrimination. Cycorp will hire only persons authorized to work in the United States and will verify identity and eligibility for employment, and complete Form I-9 for all new employees on the date of hire.