Ontologies, formal representations of knowledge as sets of concepts and relationships, play a pivotal role in artificial intelligence and information systems. They enable machines to interpret human knowledge by structuring concepts in a machine-processable way. However, ontologies are not culturally neutral: they embed assumptions from the language and worldview of their creators [1][2]. An ontology reflects a particular conceptualisation of the world, and this conceptualisation is often influenced by the culture and linguistic context in which it was developed. Professionals in knowledge engineering must therefore grapple with how cultural assumptions and linguistic structures shape concept selection, relation patterns and even biases in ontological models. In this post, we explore the interplay between ontologies, culture and language, and the implications for AI and knowledge engineering. Using real-world examples from Friend of a Friend (FOAF) and schema.org, as well as domain...
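To make the machine-processable framing concrete, here is a minimal sketch (assuming Python with the rdflib library) that describes a person using the FOAF vocabulary. The person and URI are purely illustrative; the point is that properties such as foaf:firstName and foaf:surname already presuppose a given-name/family-name split and ordering that not every naming culture follows, which is one small way cultural assumptions surface in an ontology.

```python
# Illustrative sketch only: assumes rdflib is installed and uses a hypothetical
# example person/URI to show how FOAF encodes a name.
from rdflib import Graph, Literal, Namespace, URIRef
from rdflib.namespace import RDF

FOAF = Namespace("http://xmlns.com/foaf/0.1/")

g = Graph()
g.bind("foaf", FOAF)

person = URIRef("http://example.org/people/ada")  # hypothetical identifier
g.add((person, RDF.type, FOAF.Person))
g.add((person, FOAF.firstName, Literal("Ada")))      # assumes a "first" name exists
g.add((person, FOAF.surname, Literal("Lovelace")))   # assumes a single family name

# Serialise the graph as Turtle to see the resulting triples.
print(g.serialize(format="turtle"))
```

The same modelling choice appears in schema.org's givenName/familyName pair, so the cultural assumption is not unique to FOAF; it is a recurring pattern in widely reused vocabularies.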
Executive summary

Modern scientific, technological, defence and intelligence capability depends disproportionately on cognitive variance associated with neurodivergence. Western societies historically extracted value from such cognition while marginalising contributors through medicalisation, exclusion and late recognition. Eastern societies followed an alternative path, integrating cognitive variance through role alignment and collective discipline, often without diagnostic recognition and at high personal cost. Neither model optimises resilience, wellbeing or long-term capability. Reframing neurodivergence as cognitive infrastructure enables stronger organisational performance, national resilience and competitive advantage.

Abstract

Modern scientific, technological and security capability rests upon sustained engagement with complexity, abstraction and anomaly detection. Evidence from history, organisational practice and labour-market data demonstrates persistent reliance upo...