{
  "@context": "https://w3id.org/codemeta/3.0",
  "@type": "SoftwareSourceCode",
  "name": "Claim Verification: \u201cLet (X,Y) ~ p(x,y). For each contrastive training instance, sample one positive Y_1 ~ p(y|X) and N-1 negatives Y_2,...,Y_N iid ~ p(y), conditionally independent given X. Let i* be the index of the positive, uniformly distributed over {1,...,N}. For any measurable scoring function s(x,y), define L_N(s) = - E[ log( exp(s(X,Y_{i*})) / sum_j exp(s(X,Y_j)) ) ]. Then log N - L_N(s) is a lower bound on I(X;Y). The Bayes-optimal score is s*(x,y) = log(p(y|x)/p(y)) + c(x), where c(x) is arbitrary. Under the standard multi-sample setup, the resulting InfoNCE lower bound tightens as N increases.\u201d \u2014 Proved",
  "description": "Verdict: PROVED. For any measurable critic s, the InfoNCE objective satisfies I(X;Y) >= log N - L_N(s); the Bayes-optimal critic is s*(x,y) = log(p(y|x)/p(y)) + c(x) for arbitrary c(x), and under the standard multi-sample setup the bound tightens as N increases.",
  "version": "1.23.0",
  "dateCreated": "2026-04-18",
  "license": "https://spdx.org/licenses/MIT",
  "codeRepository": "https://github.com/yaniv-golan/proof-engine",
  "url": "https://proofengine.info/proofs/infonce-lower-bounds-mutual-information/",
  "author": [
    {
      "@type": "Organization",
      "name": "Proof Engine"
    }
  ],
  "identifier": "https://doi.org/10.5281/zenodo.19645782"
}