1.  The Maximum Cut Problem

    --  Input : An undirected graph G = (V , E).

    --  Goal : A cut (A, B), i.e., a partition of V into two non-empty sets, that maximizes the number of crossing edges.

    --  Fact : NP-complete.

    --  Computationally tractable special case : Bipartite graphs (i.e., graphs that admit a cut such that all edges are crossing). These can be solved in linear time via breadth-first search: put the odd BFS levels in one set and the even levels in the other (see the sketch below).
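
    A minimal Python sketch of the bipartite special case (the adjacency-dict representation and the function name are illustrative assumptions, not from the notes): 2-color the graph by BFS level parity; if that succeeds, the two color classes form a cut crossed by every edge.

        from collections import deque

        def bipartite_max_cut(adj):
            """adj: dict mapping each vertex to an iterable of its neighbors."""
            side = {}                                  # vertex -> 0 (set A) or 1 (set B)
            for start in adj:                          # loop handles disconnected graphs
                if start in side:
                    continue
                side[start] = 0
                queue = deque([start])
                while queue:
                    u = queue.popleft()
                    for v in adj[u]:
                        if v not in side:
                            side[v] = 1 - side[u]      # odd/even BFS levels alternate
                            queue.append(v)
                        elif side[v] == side[u]:
                            return None                # odd cycle => not bipartite
            A = {v for v, s in side.items() if s == 0}
            return A, set(adj) - A                     # every edge crosses (A, B)

    For example, bipartite_max_cut({1: [2], 2: [1, 3], 3: [2]}) returns ({1, 3}, {2}), and both edges cross the cut.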

 

2.  A Local Search Algorithm

    --  Notation: For a cut (A, B) and a vertex v, define:

        -- cv(A, B) = # of edges incident on v that cross (A, B)

        -- dv(A, B) = # of edges incident on v that don't cross (A, B)

    --  Local search algorithm:

        -- Let (A , B) be an arbitrary cut of G.

        -- While there is a vertex v with dv(A, B) > cv(A, B):

            -- Move v to the other side of the cut

           [key point: this increases the number of crossing edges by dv(A, B) - cv(A, B) > 0]

        -- Return final cut (A , B)

    --  Running time: Terminates within C(n, 2) (the maximum possible number of edges) iterations. [Each iteration increases the number of crossing edges by at least one, and the number of crossing edges can never exceed the total number of edges.]

    --  Performance Guarantee : This local search algorithm always outputs a cut in which the number of crossing edges is at least 50% of the maximum possible (indeed, at least 50% of |E|, which upper-bounds the maximum). A sketch of the algorithm follows below.
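
    A minimal Python sketch of this local search (the edge-list representation and function names are illustrative assumptions; for simplicity the initial random cut may leave one side empty):

        import random

        def local_search_max_cut(vertices, edges):
            """vertices: iterable; edges: list of (u, v) pairs."""
            side = {v: random.randint(0, 1) for v in vertices}     # arbitrary initial cut

            def gain(v):
                # dv(A, B) - cv(A, B): non-crossing minus crossing edges incident on v
                dv = sum(1 for u, w in edges if v in (u, w) and side[u] == side[w])
                cv = sum(1 for u, w in edges if v in (u, w) and side[u] != side[w])
                return dv - cv

            improved = True
            while improved:
                improved = False
                for v in side:
                    if gain(v) > 0:                # moving v adds gain(v) crossing edges
                        side[v] = 1 - side[v]
                        improved = True
            A = {v for v, s in side.items() if s == 0}
            return A, set(side) - A                # locally optimal cut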

 

3.  The expected number of crossing edges of a random cut is already |E|/2.

    -- Proof: Consider a random cut (A, B), where each vertex is placed in A or B independently with probability 1/2. For each edge e in E, define

        Xe = 1 if e crosses (A, B), and 0 otherwise.

        We have E[Xe] = Pr[Xe = 1] = 1/2.

        So, by linearity of expectation, E[# crossing edges] = E[Sum(e in E){Xe}] = Sum(e in E){E[Xe]} = |E|/2.
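
    A quick Monte Carlo sanity check of this fact (the function name and trial count are arbitrary choices):

        import random

        def average_crossing_edges(vertices, edges, trials=10000):
            """Estimate E[# crossing edges] for a uniformly random cut."""
            total = 0
            for _ in range(trials):
                side = {v: random.randint(0, 1) for v in vertices}
                total += sum(1 for u, w in edges if side[u] != side[w])
            return total / trials                  # should be close to len(edges) / 2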

 

4.  Proof of Performance Guarantee

    -- Let (A, B) be a locally optimal cut. Then, for every vertex v, dv(A, B) <= cv(A, B).

    -- Summing over all v in V:

       Sum(v in V){dv(A, B)} <= Sum(v in V){cv(A, B)}

       [the left-hand side counts each non-crossing edge twice; the right-hand side counts each crossing edge twice]

    -- So:

       2[# of non-crossing edges] <= 2[# of crossing edges]

       Adding 2[# of crossing edges] to both sides and using [# of non-crossing edges] + [# of crossing edges] = |E|:

       2|E| <= 4[# of crossing edges]

       [# of crossing edges] >= 1/2 |E|

 

5.  The Weighted Maximum Cut Problem

    -- Generalization: Each edge e in E has a nonnegative weight we, and the goal is to maximize the total weight of crossing edges.

    -- Notes:

        -- Local search is still well defined: move a vertex whenever doing so increases the total weight of crossing edges (see the sketch below)

        -- The performance guarantee of 50% still holds for locally optimal cuts (and, in expectation, for a random cut)

        -- No longer guaranteed to converge in polynomially many iterations [each move still increases the total crossing weight, but the number of improving moves can be exponential in the input size]
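
    The only change needed in the earlier sketch is the move criterion; a hypothetical weighted version, with edges carried as (u, v, weight) triples:

        def weighted_gain(v, side, weighted_edges):
            """Weighted dv - cv: move v across the cut whenever this is positive."""
            dv = sum(w for u, x, w in weighted_edges if v in (u, x) and side[u] == side[x])
            cv = sum(w for u, x, w in weighted_edges if v in (u, x) and side[u] != side[x])
            return dv - cv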

 

6.  Neighborhoods

    -- Let X = set of candidate solutions to a problem.

       Examples: Cuts of a graph, TSP tours, CSP variable assignments

    -- Key ingredient: Neighborhoods

       For each x in X, specify which y in X are its "neighbors"

       Examples: x, y are neighboring cuts <==> Differ by moving one vertex

                 x, y are neighboring variable assignments <==> Differ in the value of a single variable

                 x, y are neighboring TSP tours <==> Differ by 2 edges
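
    The cut neighborhood, for instance, can be made concrete as a generator (a hypothetical helper, representing a cut as the vertex-to-side dict used in the sketches above):

        def cut_neighbors(side):
            """Yield every cut obtained by moving a single vertex across."""
            for v in side:
                neighbor = dict(side)
                neighbor[v] = 1 - neighbor[v]      # move v to the other side
                yield neighbor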

 

7.  A Generic Local Search Algorithm

    --  Let x = some initial solution.

    --  While the current solution x has a superior neighboring solution y: Set x := y

    --  Return the final (locally optimal) solution x
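
    This loop is a few lines of Python (the names neighbors and objective are placeholders; this sketch maximizes the objective and takes the first improving neighbor it finds):

        def local_search(x, neighbors, objective):
            """Generic local search: improve x until no neighbor is better."""
            while True:
                better = next((y for y in neighbors(x) if objective(y) > objective(x)), None)
                if better is None:
                    return x                       # locally optimal
                x = better

    For example, local_search(initial_cut, cut_neighbors, crossing_edges) would recover the max-cut algorithm above, given a crossing_edges function that counts the crossing edges of a cut.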

 

8.  FAQ of Local Search Algorithm

    --  How to pick initial solution x?

        -- Use a random solution as the initial solution: run many independent trials of local search and return the best locally optimal solution found.

        -- Use your best heuristics to find an initial solution and use local search as a postprocessing step to make your solution even better.

    --  If there are several superior neighboring solutions y, which one should be chosen?

        -- Choose one at random

        -- Choose the one giving the biggest improvement

        -- Use more complex heuristics

    --  How to define neighborhoods?

        -- Bigger neighborhoods (neighbors may differ in more coordinates) ==> slower to verify local optimality, but fewer (bad) local optima

        -- Find "sweet spot" between solution quality and efficient searchability.

    --  Is local search guaranteed to terminate (eventually)?

        -- If X is finite and every local step improves some objective function, then yes.

    --  Is local search guaranteed to converge quickly?

        -- Usually not.

    --  Are locally optimal solutions generally good approximations to globally optimal ones?

        -- No. [To mitigate, run randomized local search many times and remember the best locally optimal solution found; see the sketch below]
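
    A minimal restart wrapper around the local_search sketch above (random_solution is a placeholder for a problem-specific sampler):

        def local_search_with_restarts(random_solution, neighbors, objective, trials):
            """Keep the best locally optimal solution over many random starts."""
            best = None
            for _ in range(trials):
                x = local_search(random_solution(), neighbors, objective)
                if best is None or objective(x) > objective(best):
                    best = x
            return best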

 

9.  The 2-SAT Problem

    --  Input:

        -- n boolean variables x1, x2, ... , xn. (Can be set to TRUE or FALSE)

        -- m clauses of 2 literals each ("literal" = xi or !xi )

           Example: (x1 v x2) ^ (!x1 v x3) ^ (x3 v x4) ^ (!x2 v !x4)

    --  Output: "Yes" if there is an assignment that simultaneously satisfies every clause, "no" otherwise.

           Example: "yes", via (e.g.) x1 = x3 =TRUE and x2 = x4 =FALSE

 

10.  Papadimitriou's 2-SAT Algorithm

    --  Repeat log2 n times:

        -- Choose a random initial assignment

        -- Repeat 2n^2 times:

            -- If current assignment satisfies all clauses, halt + report this

            -- Else, pick an arbitrary unsatisfied clause and flip the value of one of its two variables [chosen uniformly at random]

    --  Report "unsatisfiable" [a runnable sketch follows after the notes below]

    --  Obvious good points:

        -- Runs in polynomial time

        -- Always correct on unsatisfiable instances    

    --  Key question: If there's a satisfying assignment, will the algorithm find one (with probability close to 1)?
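
    A runnable sketch of the algorithm, reusing literal_value from the 2-SAT encoding above (the max(1, ...) guard for n = 1 and the ceiling on log2 n are defensive assumptions):

        import math
        import random

        def papadimitriou_2sat(n, clauses):
            """Variables 1..n; clauses = list of signed-literal pairs.
            Returns a satisfying assignment, or None ("unsatisfiable")."""
            for _ in range(max(1, math.ceil(math.log2(n)))):
                assignment = {i: random.choice([True, False]) for i in range(1, n + 1)}
                for _ in range(2 * n * n):
                    unsatisfied = [c for c in clauses
                                   if not (literal_value(c[0], assignment) or
                                           literal_value(c[1], assignment))]
                    if not unsatisfied:
                        return assignment                 # all clauses satisfied: halt
                    lit = random.choice(random.choice(unsatisfied))   # clause, then literal
                    assignment[abs(lit)] = not assignment[abs(lit)]   # flip that variable
            return None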

 

11.  Random Walks

    --  Setup: Initially (at time 0), at position 0.

    --  At each time step, your position goes up or down by 1, with 50/50 probability.

    --  Except if at position 0, in which case you move to position 1 with 100% probability.

    --  For an integer n >= 0, let Tn = the number of steps until the random walk reaches position n. Then E[Tn] = n^2.

        --  Let Zi = number of random walk steps to reach position n starting from position i. (Note Z0 = Tn.)

        --  Edge cases: E[Zn]=0, E[Z0]=1+E[Z1]

        --  For i in {1, 2, ... , n-1},

            E[Zi] = Pr[go left] E[Zi | go left] + Pr[go right] E[Zi | go right]

                  = 1/2 (1 + E[Zi-1]) + 1/2 (1 + E[Zi+1])

            Rearranging: E[Zi] - E[Zi+1] = E[Zi-1] - E[Zi] + 2

            Since E[Z0] - E[Z1] = 1, induction gives E[Zi] - E[Zi+1] = 2i + 1; telescoping over i = 0, ..., n-1: E[Z0] = 1 + 3 + 5 + ... + (2n-1) + E[Zn] = n^2

    -- A Corollary : Pr[Tn > 2n^2] <= 1/2.

        -- n^2 = E[Tn] = Sum(k = 0 to 2n^2){k Pr[Tn = k]} + Sum(k = 2n^2 + 1 to +Infinity){k Pr[Tn = k]}

                 >= 0 + 2n^2 * Pr[Tn > 2n^2]
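
    A small simulation of the hitting time (a hypothetical helper; averaging many runs should land near n^2):

        import random

        def hitting_time(n):
            """Steps for the walk (reflecting at 0) to first reach position n."""
            position, steps = 0, 0
            while position < n:
                if position == 0 or random.random() < 0.5:
                    position += 1                  # at 0 we must move right; else 50/50
                else:
                    position -= 1
                steps += 1
            return steps

        # e.g., sum(hitting_time(10) for _ in range(10000)) / 10000 is about 100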

 

12.  For a satisfiable 2-SAT instance with n variables, Papadimitriou's algorithm produces a satisfying assignment with probability >= 1 - 1/n.

    --  Fix an arbitrary satisfying assignment a*.

    --  Let at = the algorithm's assignment after inner iteration t (t = 0, 1, ... , 2n^2)

    --  Let Xt = number of variables on which at and a* agree. (Xt in {0, 1, ... , n} is a random variable)

    --  Note: If Xt = n, algorithm halts with satisfying assignment a*.

    --  Key point: Suppose at is not a satisfying assignment and the algorithm picks an unsatisfied clause with variables xi, xj. Since a* satisfies this clause and at does not, they must differ on at least one of xi, xj.

    --  Consequence of the algorithm's random variable flip:

        -- If a* and at differ on both xi & xj, then Xt+1 = Xt + 1 (with 100% probability)

        -- If a* and at differ on exactly one of xi, xj, then Xt+1 = Xt + 1 (50% probability) or Xt+1 = Xt - 1 (50% probability)

    --  The random variables X0, X1, ... , X2n^2 behave just like a random walk on the nonnegative integers, except that:

        -- Sometimes move right with 100% probability (instead of 50%) [when at and a* differ on both variables xi, xj]

        -- Might have X0 > 0 instead of X0 = 0

        -- Might stop early, before Xt = n [the algorithm may halt at a satisfying assignment other than a*]

    --  Each of these differences can only speed up the arrival at Xt = n, so the probability that a single iteration of the outer loop fails to find a satisfying assignment is at most Pr[Tn > 2n^2] <= 1/2. Hence

        Pr[algorithm fails] = Pr[all log2 n independent trials fail] <= (1/2)^(log2 n) = 1/n.
