The application of the asymmetric Lovász Local Lemma requires finding a weight function $x$ on the bad events satisfying $\Pr[A] \le x(A) \prod_{B \in \Gamma(A)} (1 - x(B))$ for every bad event $A$. Often, one uses a constant weight function, giving rise to the symmetric LLL. Suppose this fails. Is there a systematic way to search for a better choice of $x$?
How to determine the weights in the asymmetric Lovasz Local Lemma
405 views. Asked by Bumbble Comm (https://math.techqa.club/user/bumbble-comm/detail)
In practice, this is not really done systematically: you either recognize a pattern that you've seen in previous applications of the local lemma, or you have to be very clever. Two patterns that I know of are illustrated below.
One such pattern is a "compromise local lemma" that's asymmetric but doesn't require you to choose weights. It says that we can avoid all bad events with positive probability if, for each event $A$, $$\sum_{B \in \Gamma(A)} \Pr[B] \le \frac14.$$ We get this by choosing $x(A) = 2 \Pr[A]$ in the usual asymmetric LLL: then $\prod_{B \in \Gamma(A)} (1 - x(B)) \ge 1 - 2\sum_{B \in \Gamma(A)} \Pr[B] \ge \frac12$, so $x(A) \prod_{B \in \Gamma(A)} (1 - x(B)) \ge \Pr[A]$, as the LLL requires. The symmetric version is pretty clearly a special case.
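As a quick numerical sanity check (this sketch is my own illustration, not part of the original answer; the function name and the random instances are invented), the following code verifies on random dependency graphs that the weights $x(A) = 2\Pr[A]$ satisfy the asymmetric LLL condition whenever every neighborhood sum is at most $1/4$:

```python
import itertools
import random

random.seed(0)

def lll_condition_holds(probs, neighbors, x):
    # Asymmetric LLL condition: Pr[A] <= x(A) * prod over B in Gamma(A) of (1 - x(B)).
    for A, pA in enumerate(probs):
        prod = 1.0
        for B in neighbors[A]:
            prod *= 1.0 - x[B]
        if pA > x[A] * prod + 1e-12:
            return False
    return True

# Random trials: probabilities rescaled so each neighborhood sum is <= 1/4.
for trial in range(200):
    n = 8
    neighbors = [set() for _ in range(n)]
    for A, B in itertools.combinations(range(n), 2):
        if random.random() < 0.4:
            neighbors[A].add(B)
            neighbors[B].add(A)
    probs = [random.uniform(0.001, 0.1) for _ in range(n)]
    worst = max((sum(probs[B] for B in nb) for nb in neighbors if nb), default=0.0)
    if worst > 0.25:
        probs = [p * 0.25 / worst for p in probs]
    # The compromise weights: x(A) = 2 * Pr[A].
    x = [2 * p for p in probs]
    assert lll_condition_holds(probs, neighbors, x)
```

The assertion never fires, matching the calculation above: each factor satisfies $1 - 2\Pr[B] > 0$, and the product is at least $1 - 2\sum_B \Pr[B] \ge 1/2$.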
The entropy compression approach to the local lemma may give some intuition for what's going on with the weights. Let me summarize the idea; I apologize for the length of the digression before we can get back to answering your question.
Say that we're trying to find a function $f : X \to Y$ satisfying conditions of the form "on a subset $S \subseteq X$, $f$ doesn't do this highly specific thing". It's traditional to think of this as a $Y$-coloring of the set $X$.
Our algorithm will be as follows. Start with $X$ completely uncolored, and fix an input sequence of colors from $Y$. Repeat the same step over and over: color the next uncolored element of $X$ with the next color from the input sequence; if this makes some condition's "highly specific thing" happen on its set $S$, uncolor all of $S$ and append to a running record a description of which condition was violated, in enough detail that the step can be undone.
After $t$ steps, either $X$ has been colored, or we've turned a sequence of colors from $Y^t$ into a partial coloring of $X$ plus a record of what happened on each step. Reversibility guarantees that two different inputs (elements of $Y^t$) result in two different outputs.
If the record can be made terse enough that the number of possible records after $t$ steps grows only as $C^t$ for some $C < |Y|$, then for large enough $t$ the number of possible outputs (partial colorings of $X$, of which there are a fixed finite number, times records) is smaller than $|Y|^t$. If every one of the $|Y|^t$ inputs ran for all $t$ steps, reversibility would give an injection from $|Y|^t$ inputs into fewer than $|Y|^t$ outputs, which the pigeonhole principle forbids; so it must be possible for the algorithm to terminate, that is, for $X$ to end up entirely colored.
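This uncolor-and-retry procedure is essentially the Moser–Tardos resampling algorithm. Here is a minimal sketch for hypergraph 2-coloring, where each condition says "edge $S$ is not monochromatic" (the function name and the toy instance are my own illustration, assuming edges large enough to be well inside LLL territory):

```python
import random

def moser_tardos_2coloring(n, edges, seed=0, max_steps=100_000):
    """Color n vertices with 2 colors so no edge is monochromatic,
    by resampling any violated edge, as in the algorithmic LLL of
    Moser and Tardos (the algorithm that entropy compression analyzes)."""
    rng = random.Random(seed)
    color = [rng.randint(0, 1) for _ in range(n)]
    for _ in range(max_steps):
        bad = next((e for e in edges if len({color[v] for v in e}) == 1), None)
        if bad is None:
            return color          # proper 2-coloring found
        for v in bad:             # "uncolor the set S" and redraw it
            color[v] = rng.randint(0, 1)
    return None                   # step budget exhausted (shouldn't happen here)

# Toy instance: 30 vertices, 12 random edges of size 8; each edge is
# monochromatic with probability 2^-7 under a fresh random coloring.
rng = random.Random(1)
edges = [rng.sample(range(30), 8) for _ in range(12)]
coloring = moser_tardos_2coloring(30, edges, seed=1)
assert coloring is not None
assert all(len({coloring[v] for v in e}) == 2 for e in edges)
```

The step budget `max_steps` is only a safety valve: under the LLL condition the expected number of resamplings is linear in the number of events.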
The entropy compression method is (more or less) equivalent to the local lemma. Instead of choosing values $x(A)$, we're choosing ways to encode the description "we uncolored the set $S$ determining $A$; here's how to undo that". These roughly correspond to each other in the sense that if we take $b(A)$ bits to encode this description, then probably something like $x(A) = 2^{-b(A)}$ will work as a weight in the local lemma.
(Intuitively, this is because $x(A)$ is an upper bound for the conditional probability of $A$ happening given that no other bad events happened; $\log_2 \frac{1}{x(A)}$ measures the information content of an event with probability $x(A)$, so there should be an encoding to match.)
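To make that correspondence concrete with a tiny computation (my own illustration, using the standard symmetric choice $x = 1/(d+1)$ for dependency degree $d$): the matching record cost is $b = \log_2(d+1)$ bits, exactly enough to name one of the $d+1$ events in a neighborhood, and indeed $2^{-b} = x$:

```python
import math

def bits_for_weight(x):
    # Information content (in bits) of an event with "probability" x:
    # the encoding budget the entropy compression argument can spend on it.
    return math.log2(1.0 / x)

# Symmetric LLL with dependency degree d uses the weight x = 1/(d+1);
# the matching record cost is log2(d+1) bits.
for d in [1, 3, 7, 15]:
    x = 1.0 / (d + 1)
    b = bits_for_weight(x)
    assert abs(b - math.log2(d + 1)) < 1e-12
    assert abs(x - 2.0 ** (-b)) < 1e-12
```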
The encoding of the description in the entropy compression algorithm is still not something we can really find systematically: it's as hard as finding any other good lossless data compression scheme. But you have some idea of how much you can afford to spend on encoding an event based on how complicated it is - so we're less stumbling in the dark here.