Graph news

Zakhar Kogan
4 min read · Sep 3, 2023

Cold water for AI

Growth rate of key innovations

You know what happens in cold water? Everything shrinks.

Bing’s market share hasn’t grown at all; its share of search is still stuck at a lousy 3%.

Traffic to ChatGPT and other LLMs has shrunk too. What could the reasons be?

  • People are realizing it’s no magic bullet: it’s prone to hallucinations, among other problems. All this despite OpenAI and the OSS community pushing out things like AI agents, interactive fiction, and multimodal models for healthcare
  • Again, a realization that the AI revolution won’t take weeks but months or years, if it arrives at all. Meanwhile, people are losing jobs and turning to so-called ‘sweatshops’, often the only income source available, where they face platform bans, below-minimum-wage payouts, and exploitation. It’s a social issue I feel deserves outcry.
  • Personally, I see it as a combination of two things: hastily trying to pump deterministic outputs out of a non-deterministic system, and a one-armed bandit whose lever we relentlessly pull, because each time we get an output that is approximately right and plausible, yet it’s never the jackpot
AI AI AI AI… AI!

So, consumer demand is shrinking, while companies adopt the useful parts, the ones that let them run cheaper and extract more revenue. The usual. Remind you of anything?

Hmmm…

Yeah, same for me. Are there any ways forward, around, or inside? I believe there are, with one possibility lying in reasoning.

Graph Machine Learning @ ICML 2023

Graphs + ML = ❤️

SE(3) Diffusion

Some of the papers’ topics:

  • New GNN architectures: novel ideas like dynamically rewired message passing with delay and slow nodes for long-range tasks
  • Generative models: e.g. diffusion models for graph generation (think Stable Diffusion, but making graphs)
  • Molecules & proteins: generating molecules, describing proteins by their structure, predicting high-resolution mass spectra (in 19 minutes instead of 126 hours, a ~398x speedup)
  • Knowledge: knowledge graph embedding (‘packing’), discovering both unseen entities and unseen relations
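For readers new to GNNs, the core idea behind all the message-passing variants above can be sketched in a few lines. This is a minimal, illustrative vanilla layer (sum aggregation plus a linear map), not the rewired/delayed architectures from the papers; all names and shapes here are my own assumptions:

```python
import numpy as np

def message_passing_layer(adj: np.ndarray, feats: np.ndarray,
                          weight: np.ndarray) -> np.ndarray:
    """One round of sum-aggregation message passing (illustrative sketch).

    adj:    (n, n) adjacency matrix, adj[i, j] = 1 if there is an edge j -> i
    feats:  (n, d_in) node feature matrix
    weight: (d_in, d_out) learnable projection
    """
    messages = adj @ feats                      # sum neighbour features per node
    combined = messages + feats                 # add the node's own features (self-loop)
    return np.maximum(combined @ weight, 0.0)   # linear map + ReLU nonlinearity

# Toy graph: 3 nodes in a path 0 - 1 - 2
adj = np.array([[0, 1, 0],
                [1, 0, 1],
                [0, 1, 0]], dtype=float)
feats = np.eye(3)           # one-hot node features
weight = np.ones((3, 2))    # dummy weights for demonstration
out = message_passing_layer(adj, feats, weight)
print(out)  # node 1 (two neighbours) aggregates more than nodes 0 and 2
```

Stacking k such layers lets information travel k hops, which is exactly why long-range tasks motivate tricks like the rewiring and slow nodes mentioned above.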

Neurosymbolic AI recordings & excerpts

By Knowable Magazine

Day 1. Crash Course in Neuro-Symbolic AI (Aug 29, 2023)

Introduction

Intro to Practical Knowledge Representation and Logic + State-of-the-Art Practical Reasoning and Meta-Logic (hour 2)

Logic with probabilities

From Probabilistic Logics to Neurosymbolic AI

Reasoning with large language models

Robust Logic: Past, Present and Future

Compositional generalization

Panel

Day 2: Diverse Approaches at the Research Frontier of Neurosymbolic AI (Aug 30, 2023)

Some extensions and applications of Robust Logic

Deep Learning with Logical Requirements

Neuro-vector symbolic architectures

AI can learn from data. But can it learn to reason?

Thinking fast and slow in AI planning

Utilizing knowledge in compositional generalization

Causal abstraction for faithful, human-interpretable model explanations

Model-based ML: Towards causal reasoning in an AI Scientist

Panel

NSAI Toolkit

Source

A huge repository of IBM-developed open-source neurosymbolic AI software, divided into 8 categories:

  • Logical Neural Network (LNN)
  • Natural language processing via reasoning (NLP)
  • Knowledge foundation (KF)
  • Learning with less (LwL)
  • Knowledge augmented sequential decision making (SDM)
  • Human in the loop (HIL)
  • Datasets and environments (DS)
  • Related advances (RA)

Plus a nice taxonomy of neurosymbolic systems, which I won’t include for brevity.

Welcome to Teleogenic/Boi Diaries❣️

Other places I cross-post to:

Reuse

MIT

Citation

BibTeX citation:

@online{kogan2023,
  author = {Kogan, Zakhar},
  title = {Graph News},
  date = {2023-09-03},
  url = {https://teleogenic.com/posts/230903-graphs},
  langid = {en}
}

For attribution, please cite this work as:

Kogan, Zakhar. 2023. “Graph News.” September 3, 2023. https://teleogenic.com/posts/230903-graphs.
