Local search (optimization)

In computer science, local search is a heuristic method for solving computationally hard optimization problems. Local search can be used on problems that can be formulated as finding a solution that maximizes a criterion among a number of candidate solutions. Local search algorithms move from solution to solution in the space of candidate solutions (the search space) by applying local changes, until a solution deemed optimal is found or a time bound has elapsed.
Local search algorithms are widely applied to numerous hard computational problems, including problems from computer science (particularly artificial intelligence), mathematics, operations research, engineering, and bioinformatics. Examples of local search algorithms are WalkSAT, the 2-opt algorithm for the traveling salesman problem, and the Metropolis–Hastings algorithm.

While it is sometimes possible to substitute gradient descent for a local search algorithm, gradient descent is not in the same family: although it is an iterative method for local optimization, it relies on the objective function's gradient rather than an explicit exploration of the solution space.
Examples

Some problems where local search has been applied are:

- The vertex cover problem, in which a solution is a vertex cover of a graph, and the target is to find a solution with a minimal number of nodes.
- The traveling salesman problem, in which a solution is a cycle containing all nodes of the graph and the target is to minimize the total length of the cycle.
- The Boolean satisfiability problem, in which a candidate solution is a truth assignment, and the target is to maximize the number of clauses satisfied by the assignment; in this case, the final solution is of use only if it satisfies all clauses.
- The nurse scheduling problem, where a solution is an assignment of nurses to shifts that satisfies all established constraints.
- The k-medoid clustering problem and other related facility location problems, for which local search offers the best known approximation ratios from a worst-case perspective.
- The Hopfield neural network problem of finding stable configurations in a Hopfield network.
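The traveling salesman formulation above can be made concrete with a minimal 2-opt local search: repeatedly reverse one segment of the cycle while doing so shortens it. The four-city instance and the helper names below are invented for illustration, not a standard API.

```python
import math

def tour_length(tour, pts):
    # Total length of the closed cycle visiting every city once.
    return sum(math.dist(pts[tour[i]], pts[tour[(i + 1) % len(tour)]])
               for i in range(len(tour)))

def two_opt(tour, pts):
    # A 2-opt move reverses one segment of the tour; accept it only if
    # it shortens the cycle, and stop when no move improves.
    improved = True
    while improved:
        improved = False
        for i in range(1, len(tour) - 1):
            for j in range(i + 1, len(tour) + 1):
                cand = tour[:i] + tour[i:j][::-1] + tour[j:]
                if tour_length(cand, pts) < tour_length(tour, pts):
                    tour, improved = cand, True
    return tour

# Four cities on a unit square; the optimal cycle is the square's perimeter.
pts = [(0, 0), (1, 1), (1, 0), (0, 1)]
best = two_opt([0, 1, 2, 3], pts)
print(tour_length(best, pts))  # 4.0, the perimeter tour
```

Every accepted move strictly decreases the length, so the loop terminates at a 2-opt local optimum, which in general need not be the globally shortest tour.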
Description

Most problems can be formulated in terms of a search space and target in several different ways. For example, for the traveling salesman problem a solution can be a route visiting all cities, with the goal of finding the shortest route; but a solution can also be a path, with being a cycle made part of the target.

A local search algorithm starts from a candidate solution and then iteratively moves to a neighboring solution, a neighborhood being the set of all potential solutions that differ from the current solution by the minimal possible extent. This requires a neighborhood relation to be defined on the search space. As an example, a neighbor of a vertex cover is another vertex cover differing by only one node; for Boolean satisfiability, the neighbors of a truth assignment are those that have a single variable in the opposite state. The same problem may have multiple distinct neighborhoods defined on it; local optimization with neighborhoods that involve changing up to k components of the solution is often referred to as k-opt.

Typically, every candidate solution has more than one neighbor; the choice of which one to move to is made using only information about the solutions in the neighborhood of the current assignment, hence the name local search. When the neighbor chosen is the one that locally maximizes the criterion, i.e. a greedy search, the metaheuristic takes the name hill climbing. When no improving neighbors are present, local search is stuck at a locally optimal point. This local-optima problem can be cured by restarts (repeated local search from different initial conditions), by randomization, or by more complex schemes based on iterations (such as iterated local search), on memory (such as reactive search optimization), or on memoryless stochastic modifications (such as simulated annealing).
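The greedy neighbor selection and restart strategy just described can be sketched over bit strings. The OneMax objective (count the ones) and all names below are illustrative choices, not a standard API; OneMax has a single optimum, so here the restarts are only a safeguard of the kind that matters on rugged objectives.

```python
import random

def hill_climb(objective, n_bits, restarts=5, seed=0):
    """Greedy local search over bit strings; neighbors differ in one bit."""
    rng = random.Random(seed)
    best = None
    for _ in range(restarts):  # restarts re-run the search from new start points
        x = [rng.randint(0, 1) for _ in range(n_bits)]
        while True:
            # Neighborhood: all assignments differing in exactly one bit.
            nbrs = [x[:i] + [1 - x[i]] + x[i + 1:] for i in range(n_bits)]
            cand = max(nbrs, key=objective)  # greedy neighbor choice
            if objective(cand) <= objective(x):
                break  # no improving neighbor: a locally optimal point
            x = cand
        if best is None or objective(x) > objective(best):
            best = x
    return best

# OneMax: maximize the number of ones; the optimum is the all-ones string.
best = hill_climb(sum, 10)
print(sum(best))  # 10
```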
Local search does not provide a guarantee that any given solution is optimal. The search can terminate after a given time bound or when the best solution found thus far has not improved in a given number of steps. Local search is an anytime algorithm: it can return a valid solution even if it is interrupted at any point after finding the first valid solution. Local search is typically an approximation or incomplete algorithm, because the search may stop even though the current best solution is not optimal. This can happen even when termination is due to the current best solution having no improving neighbor, as the optimal solution can lie far from the neighborhood of the solutions crossed by the algorithm.

Schuurmans and Southey propose three measures of effectiveness for local search (depth, mobility, and coverage):

- depth: the cost of the current (best) solution;
- mobility: the ability to rapidly move to different areas of the search space (whilst keeping the cost low);
- coverage: how systematically the search covers the search space, measured as the maximum distance between any unexplored assignment and all visited assignments.

They hypothesize that local search algorithms work well not because they have some understanding of the search space, but because they quickly move to promising regions and explore the search space at low depths as quickly, broadly, and systematically as possible.
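The incomplete, anytime character described above can be illustrated with a small single-variable-flip search for MAX-SAT. This is a deliberately simplified sketch, not the actual WalkSAT algorithm; the formula, flip budget, and helper names are invented for the example.

```python
import random

def maxsat_local_search(clauses, n_vars, max_flips=1000, seed=0):
    """Flip one variable at a time to raise the number of satisfied clauses.

    Incomplete and anytime: it returns the best assignment seen when the
    flip budget runs out, with no guarantee of optimality.
    """
    rng = random.Random(seed)
    # A clause is satisfied if any of its literals holds under assignment a.
    sat = lambda a: sum(any(a[abs(l)] == (l > 0) for l in c) for c in clauses)
    assign = {v: rng.choice([True, False]) for v in range(1, n_vars + 1)}
    best, best_score = dict(assign), sat(assign)
    for _ in range(max_flips):
        v = rng.randint(1, n_vars)    # pick a variable at random
        assign[v] = not assign[v]     # neighbor: one variable in opposite state
        score = sat(assign)
        if score >= best_score:       # accept improving and sideways moves
            best, best_score = dict(assign), score
        else:
            assign[v] = not assign[v]  # reject worsening flips
    return best, best_score

# Literals are signed ints: 1 means x1, -2 means NOT x2 (a common convention).
clauses = [(1, 2), (-1, 3), (-2, -3), (1, 3)]
best, score = maxsat_local_search(clauses, 3)
print(score)  # clauses satisfied; all 4 are simultaneously satisfiable here
```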
Real-valued search-spaces

Several methods exist for performing local search of real-valued search-spaces:

- Luus–Jaakola searches locally using a uniform distribution and an exponentially decreasing search-range.
- Random optimization searches locally using a normal distribution.
- Random search searches locally by sampling a hypersphere surrounding the current position.
- Pattern search takes steps along the axes of the search-space using exponentially decreasing step sizes.
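A rough sketch of the Luus–Jaakola idea: propose uniform moves within a range that shrinks after failed moves. The objective, bounds, and shrink factor below are placeholder choices, not part of any published parameterization.

```python
import random

def luus_jaakola(f, lo, hi, iters=200, shrink=0.95, seed=0):
    """Minimize f over the box [lo, hi]^d with a shrinking uniform search range."""
    rng = random.Random(seed)
    d = len(lo)
    x = [rng.uniform(lo[i], hi[i]) for i in range(d)]
    half = [(hi[i] - lo[i]) / 2 for i in range(d)]  # current search half-range
    for _ in range(iters):
        # Uniform proposal around x, clipped to the box.
        y = [min(max(x[i] + rng.uniform(-half[i], half[i]), lo[i]), hi[i])
             for i in range(d)]
        if f(y) < f(x):
            x = y                               # accept improving moves
        else:
            half = [h * shrink for h in half]   # otherwise shrink the range
    return x

# Sphere function: minimum value 0 at the origin.
x = luus_jaakola(lambda v: sum(t * t for t in v), [-5.0, -5.0], [5.0, 5.0])
print(sum(t * t for t in x))  # much smaller than a typical random start
```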
See also

Local search is a sub-field of:

- Metaheuristics
- Stochastic optimization
- Optimization

Fields within local search include:

- Hill climbing
- Simulated annealing (suited for either local or global search)
- Tabu search
- Late acceptance hill climbing
- Reactive search optimization (combining machine learning and local search heuristics)
References

- "12LocalSearch.key" (PDF). Archived from the original on 2012-03-16.

Bibliography

- Arya, Vijay; Garg, Naveen; Khandekar, Rohit; Meyerson, Adam; Munagala, Kamesh; Pandit, Vinayaka (2004). "Local Search Heuristics for k-Median and Facility Location Problems". SIAM Journal on Computing 33(3).
- Battiti, Roberto; Brunato, Mauro; Mascia, Franco (2008). Reactive Search and Intelligent Optimization. Springer Verlag. ISBN 978-0-387-09623-0.
- Hoos, H. H.; Stützle, T. (2005). Stochastic Local Search: Foundations and Applications. Morgan Kaufmann.
- Hromkovič, Juraj. Algorithmics for Hard Problems: Introduction to Combinatorial Optimization, Randomization, Approximation, and Heuristics. Springer.
- Michiels, Wil; Aarts, Emile; Korst, Jan. Theoretical Aspects of Local Search. Springer.
- Schuurmans, D.; Southey, F. (2001). "Local search characteristics of incomplete SAT procedures". Artificial Intelligence 132(2): 121–150.

Text is available under the Creative Commons Attribution-ShareAlike License. Additional terms may apply.
