\documentclass[twoside,11pt]{article}
\usepackage{jair, theapa, rawfonts}
%\usepackage{setspace}
\usepackage{amsmath}
\usepackage{amsfonts}
\usepackage{parskip}
\usepackage{algorithmic}
\usepackage{algorithm2e}
\usepackage{graphicx}
\usepackage{subfigure}
\begin{document}
%\title{Numerical Evidence for a Potentially Hard Class of Random Graphs for the Hamiltonian Cycle Problem }
\title{ Numerical Evidence For Hard Random Instances of the Hamiltonian Cycle Problem }
%for Graph Instances Whose Degree Distribution is L\'evy-Stable }
% Numerical
%\author{ Abram Connelly, Ian Gent}
\author{\name Abram Connelly \email abram.connelly@gmail.com\\
\name Ian P. Gent \email ian.gent@st-andrews.ac.uk \\
\addr School of Computer Science, St Andrews University, St Andrews, UK}
% For research notes, remove the comment character in the line below.
%\researchnote
%\date{\today}
\maketitle
\begin{abstract}
Numerical evidence is presented for a phase transition in the probability
of a Hamiltonian cycle existing in graphs whose degree distribution is
chosen from a modified L\'evy-Stable distribution.
Graphs whose degree distribution is chosen to be L\'evy-stable are
presented as a potential ensemble of graphs that are intrinsically difficult
to test for Hamiltonicity.
Numerical evidence is presented
to support this claim by showing hard random instances
exist near the critical threshold.
Finding Hamiltonian cycles in Erd\"os-R\'enyi random graphs is well
known to admit almost sure polynomial time algorithms, even near
the critical threshold.
To the knowledge of the authors, the graph ensemble presented is the
first candidate, without specific graph structure built
in, to generate graphs whose Hamiltonicity is intrinsically hard to
determine.
Search cost is highly dependent on the particular algorithm used
and the graph ensemble is presented only as a potential candidate
for generating graphs that are intrinsically difficult to test for Hamiltonicity.
%We study Hamiltonian cycles occurring in randomly generated graphs whose degree sequences are
%chosen from the class of approximate L\'evy-Stable distributions.
%We give numerical evidence that graphs generated in this way are difficult to solve. Graphs generated in this
%way do not suffer from the finite variance in the degree distribution that allow Erd\"os-R\'enyi graphs to be
%solved in (a.s.) polynomial time.
\end{abstract}
\section{Introduction}
%%%%%
A Hamiltonian cycle is
a graph traversal that visits each vertex exactly once and ends
adjacent to where it started.
Given a graph $G = (V, E)$, of vertices $V$ and
edges $E$, $|V|=n$, $|E|=m$, the Hamiltonian cycle problem
asks if a path, $P = (v_0, v_1, \dots , v_{n-1})$, exists such that
%$v_i \ne v_j$ for $i \ne j$ and with $(v_i, v_{i+1}) \cup (v_{n-1}, v_0) \in E$ for
$v_i \ne v_j$ for $i \ne j$ and with $(v_i, v_{i+1}), (v_{n-1}, v_0) \in E$ for
$0 \le i < n-1$.
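The definition above amounts to a short verification routine; a minimal Python sketch (the function name and edge representation are our own illustration, not from the paper):

```python
def is_hamiltonian_cycle(path, n, edges):
    """Return True if `path` is a Hamiltonian cycle of the simple graph on
    vertices 0..n-1 with edge set `edges` (a set of frozensets)."""
    # Every vertex must appear exactly once.
    if len(path) != n or set(path) != set(range(n)):
        return False
    # Each consecutive pair, plus the wrap-around pair (v_{n-1}, v_0),
    # must be an edge of the graph.
    return all(frozenset((path[i], path[(i + 1) % n])) in edges
               for i in range(n))

# The 4-cycle 0-1-2-3-0:
edges = {frozenset(e) for e in [(0, 1), (1, 2), (2, 3), (3, 0)]}
print(is_hamiltonian_cycle([0, 1, 2, 3], 4, edges))  # True
print(is_hamiltonian_cycle([0, 2, 1, 3], 4, edges))  # False: (0, 2) is not an edge
```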
Unless otherwise stated, all graphs are assumed to be simple: undirected edges,
no self-loops and no multiple edges.
%The Hamiltonian cycle problem asks to find a cycle, $(w_0, w_1, \cdots, w_{n-1})$,
%in a graph, $G = (V, E)$, with vertices, $V$, and edges, $E$. The Hamiltonian cycle
%problem is one of Gary and Johnston's canonical NP-Complete problems \cite{garey}.
As an aid in the analysis of algorithms and to gain further insight into the difference
between $P$ and $NP$, so-called phase transitions of NP-Complete
problems have come under investigation \cite{selman,gent,hartmann_weigt_2001,hartmann_weigt,martin,monasson}.
In this context, the phase transition of an NP-Complete problem is a rapid change
in the probability of a randomly generated instance being soluble under a certain
parametrization. For example,
Erd\"os and R\'enyi \citeyear{erdos} proposed a model of random graphs
whose edges are chosen with a probability, $p$, independently and at random. As the probability
is varied for a fixed $n$, the probability of finding a Hamiltonian cycle jumps from 0 to 1 at
a critical point \cite{komlos,posa}.
Many other NP-Complete problems undergo a phase transition,
including the number partition problem \cite{gent,mertens,borgs_pt}, 3-SAT \cite{selman}
and graph coloring \cite{cheeseman}, to name a few.
%It is thought that every NP-Complete problem undergoes a phase transition \cite{foo} and may be a
%defining characteristic of NP-Complete problems in general \cite{monasson ???}.
Since phase transitions are so prevalent in other NP-Complete problems, it is likely that
all NP-Complete problems, suitably parametrized, undergo a phase transition in their probability
of solution. The hope is that by analyzing the Hamiltonian cycle as a canonical NP-Complete problem,
insight derived for this class can be applied to other NP-Complete problems.
The phase transition only speaks to the probability of a solution existing and says nothing about
the difficulty of finding a solution. It was initially thought
that ``hard'' instances were generated at or near the critical transition point \cite{cheeseman}.
Since then the picture has become more complex.
For the number partition problem, instances appear difficult nearly everywhere,
even very far away from the transition point \cite{gent,gent_npp_an}. For 3-SAT, survey propagation \cite{zecchina} solves randomly
created instances that are soluble extremely close to the transition point while proofs of insolubility
appear to grow exponentially near the other side of the transition point \cite{beame_karp_pitassi_saks,ben_sasson_widgerson,li_gerard}.
The Hamiltonian cycle problem on Erd\"os-R\'enyi random graphs, on the other hand,
has probabilistic algorithms to determine Hamiltonicity in almost sure polynomial time everywhere,
even near the critical point of the phase transition \cite{bollobas_fenner_frieze,angluin,shamir,bollobas}.
%While the Hamiltonian cycle problem displays a phase transition in Erd\"os-R\'enyi random graphs, the probability of finding a solution
%does not produce graphs whose Hamiltonicity is harder than polynomial time to detect.
The Hamiltonian cycle problem represents an NP-Complete class that is thought to
be exponentially difficult in general,
but when graph instances are randomly generated, at least for Erd\"os-R\'enyi random
graphs, the instances are almost surely solvable in polynomial time.
%solvable instances, at least for Erd\"os-R\'enyi random graphs.
This represents a conundrum in that there exists an NP-Complete class of problems
that is thought to be exponentially difficult to solve in general, but for which
finding generically hard random instances in practice is difficult.
The hope is that gaining insight into why Hamiltonicity is easy to determine for some random graphs
while difficult for others, will shed light on how to create generically hard instances
for other NP-Complete problems
and will allow us to get a better picture of the boundary
between $P$ and $NP$.
In this paper, a class of random graphs is presented as a candidate distribution for which
determining the Hamiltonicity of random instances might be intrinsically hard.
Numerical evidence is provided to suggest that Hamiltonicity is difficult to determine
for graphs whose degree sequence is generated from the
family of L\'evy-Stable distributions.
A slightly modified algorithm used by Culberson and Vandegriend \citeyear{vandegriend}
for small graphs and a randomized
algorithm proposed by P\'osa \citeyear{posa} for large graphs are used to generate run time analysis. As run times
are highly dependent on the algorithms used, the class of graphs is presented
only as a candidate for generating intrinsically hard graphs whose Hamiltonicity is difficult to determine.
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
\section{Previous Work}
\subsection{Hamiltonian Cycles on Random Graphs}
Erd\"os and R\'enyi \citeyear(erdos) introduced the concepts of random graphs.
Let $n$ be the number of vertices. Choose a probability, $p$, and consider
each potential edge between the $n$ vertices, joining an edge between them
with probability $p$. One can also consider fixing the number of edges, $m$, then choosing
edges to place in the graph by a random uniform choice from the set of ordered pairs of vertices.
These two formulations have minor
differences but, under generally lax conditions for the Hamiltonian cycle problem,
the $G_{n,p}$ ensemble can be replaced by
the $G_{n,m}$ ensemble
by setting $p = m /\binom{n}{2} $ (See \cite{bollobas}, Theorem 2.2).
The two formulations will be used interchangeably as the need arises.
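The correspondence $p = m / \binom{n}{2}$ between the two ensembles can be made concrete in a few lines; the sketch below is our own illustration (function names are not from the paper):

```python
import itertools
import random

def gnp(n, p, rng=random):
    """G_{n,p}: each of the C(n,2) possible edges is included
    independently with probability p."""
    return [e for e in itertools.combinations(range(n), 2) if rng.random() < p]

def gnm(n, m, rng=random):
    """G_{n,m}: m distinct edges chosen uniformly at random."""
    return rng.sample(list(itertools.combinations(range(n), 2)), m)

# The two ensembles are matched by setting p = m / C(n,2):
n, m = 100, 250
p = m / (n * (n - 1) / 2)
```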
Erd\"os and R\'enyi \citeyear{erdos_60,erdos_61} were first to notice a phase transition for the appearance of the
giant component.
This work
provided the basis for much work done on other qualities of the graph, most notably
%for this thesis,
in this context,
the appearance of a Hamiltonian cycle.
The appearance of a Hamiltonian cycle and the appearance of the giant component
are distinct graph properties, but both appear through a rapid transition
as the edge probability parameter, $p$, is increased.
The microscopic quantity whose variation induces the macroscopic change of state, or phase
change, is usually called the order parameter.
The property that is searched for determines the exact
location of the phase transition point.
Finding the suitable parametrization
for a given NP-Complete class is still an open problem \cite{martin}.
Following Bollob\'as \citeyear{bollobas},
the order parameter for the phase transition of Hamiltonian cycles in
Erd\"os-R\'enyi random graphs is:
$$ m(n) = (n/2)(\ln n + \ln \ln n + c_n ) $$
where $m(n)$ is the number of edges as a function of the number of vertices, $n$,
and $c_n$ is the critical parameter.
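As a concrete illustration of this parametrization (a sketch of ours; the function name is not from the paper), the threshold edge count is a vanishing fraction of all $\binom{n}{2}$ possible edges:

```python
import math

def critical_edge_count(n, c_n=0.0):
    """m(n) = (n/2)(ln n + ln ln n + c_n), the edge count at the
    Hamiltonicity threshold for G_{n,m}."""
    return (n / 2) * (math.log(n) + math.log(math.log(n)) + c_n)

# The threshold sits at an edge density that shrinks as n grows:
for n in (100, 1000, 10000):
    density = critical_edge_count(n) / (n * (n - 1) / 2)
    print(n, round(critical_edge_count(n)), round(density, 4))
```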
Cheeseman, Kanefsky and Taylor \citeyear{cheeseman}
gave numerical results for search cost when running algorithms
on Erd\"os-R\'enyi random graphs and reported an exponential
increase in computational search cost near the critical threshold.
Unfortunately their choice of heuristics turned out to be ill
suited and with better algorithms one can almost surely
determine Hamiltonicity
in Erd\"os-R\'enyi random graphs in polynomial time.
Bollob\'as, Fenner and Frieze \citeyear{bollobas_fenner_frieze} improved on
algorithms presented by Angluin and Valiant \citeyear{angluin} and Shamir \citeyear{shamir}
to create their
HAM algorithm to find Hamiltonian cycles
in Erd\"os-R\'enyi random graphs and show:
$$ \displaystyle\lim_{n \to \infty} P(\text{HAM finds a Ham.\ cycle in $G_{n,m}$ }) =
\left\{ \begin{array}{l l l}
0, & \quad \text{if $c_n \to -\infty$}\\
e^{-e^{-c}}, & \quad \text{if $c_n \to c$}\\
1, & \quad \text{if $c_n \to \infty$}
\end{array} \right.
$$
HAM runs in $o(n^{4+\epsilon})$ time for some $\epsilon > 0$, giving an almost sure
polynomial time algorithm for finding Hamiltonian cycles in Erd\"os-R\'enyi random
graphs.
In practice one can do much better.
One such optimization is proposed by Culberson and Vandegriend \citeyear{vandegriend}
who constructed a lowest degree first backtracking algorithm
with some additional heuristics to prune the graph as a path is tried.
Section \ref{culb-van} will go into more detail about Culberson and Vandegriend's algorithm and the heuristics used.
%Chapter 3 will go into more detail about Culberson and Vandegriend's
%algorithm and heuristics used.
For a detailed description of other algorithms used for Hamiltonian cycle determination,
the reader is referred to the master's thesis by Vandegriend \citeyear{vandegriend_thesis}.
%A more detailed description of other algorithms used for Hamiltonian cycle determination
%is given in the next subsection.
Culberson and Vandegriend found that hard instances were rarely encountered and became more
infrequent as larger instances of graphs were generated. In this way, the random
Hamiltonian cycle problem for Erd\"os-R\'enyi graphs is one that is theoretically
known to have an efficient solution and has a practically efficient algorithm.
Culberson and Vandegriend pointed out that their algorithm becomes exponential in its
search cost when graph instances are Interconnected-Cutset (ICCS) graphs even though
ICCS graphs are known to have a polynomial time algorithm to verify
Hamiltonicity.
%In this thesis,
In this paper, Culberson and Vandegriend's algorithm has been used
to provide numerical evidence of an increase in search cost for graphs
whose degree
sequence is chosen from a L\'evy-Stable distribution, but this should be taken
in context of the known shortcomings of this particular algorithm.
\subsection{Power Law Degree Distributed Graphs}
Many real world graphs display a power law in their degree distribution, from graphs based
on Web connectivity \cite{adamic} to social networks \cite{albert,newman}. These
power law degree distributed graphs have a much different character than Erd\"os-R\'enyi graphs, displaying a diverging second moment in their
degree distribution and a smaller average diameter
\cite{esker_hofstad_hooghiemstra_znamenski,hofstad_hooghiemstra_mieghem,hofstad_hooghiemstra_znamensi}.
Erd\"os-R\'enyi random graphs have diameter approximately $\ln(n)$, whereas power law degree distributed
random graphs have diameter that is either a constant or $\ln(\ln(n))$, depending on whether the critical exponent $\alpha \in (0, 1)$ or
$\alpha \in (1, 2)$ respectively \cite{esker_hofstad_hooghiemstra_znamenski,hofstad_hooghiemstra_mieghem,hofstad_hooghiemstra_znamensi}.
Bianconi and Marsili \citeyear{bianconi} give analysis
on loops of all lengths on power law degree distributed graphs and show
that small loops are much more common in graphs
%whose second moment
where the second moment
in the degree distribution diverges. Gleiss et al.\ \citeyear{gleiss_stadler_wagner}
confirm this in known real world power law degree distributed graphs
of large metabolic networks, finding that triangles are much more common
than in their Erd\"os-R\'enyi counterparts.
Molloy and Reed \citeyear{molloy} introduced the
``Erased Configuration Model'', which pairs vertices based on a degree sequence
chosen proportional to the remaining free slots.
Britton, Deijfen and Martin-L\"of \citeyear{britton}
show that the
Erased Configuration Model
asymptotically approaches the prescribed degree distribution chosen.
While not guaranteeing the exact degree sequence input, the erased configuration model is
chosen as the graph generation method
%in this thesis
for its simplicity and speed.
Berger \citeyear{berger_pa} and Chung and Lu \citeyear{chung_complex} discuss other types of graphs and their generation.
Blitzstein and Diaconis \citeyear{diaconis} give a polynomial
time algorithm to determine whether a degree sequence is realizable as a graph and give an importance sampling method to choose
from the space of possible graphs. From an algorithmic perspective this is near ideal, but their
algorithm has a worst case run-time of $O(n^2 \bar{d})$, for
$n$ vertices with average degree $\bar{d}$. Blitzstein and Diaconis discuss the average run-time of
their algorithm, stating that ``... we do not have a better bound on the average running time than the worst case running time ...''.
From experimentation by the authors, Blitzstein and Diaconis's algorithm is not fast enough in its average run time to be used for
the quantity of instances used in our analysis.
%needed in this thesis's analysis.
For this reason, Blitzstein and Diaconis's algorithm was not used.
Reed and Hughes \citeyear{reed} give a good discussion on why power laws are so prevalent in nature.
Nolan \citeyear{nolan}, Zolotarev \citeyear{zolotarev}
and Feller \citeyear{feller} give references and further discussions on L\'evy-Stable distributions.
It could be that the spectrum of a graph, either of its adjacency matrix or of one of its close relatives, such as its Laplacian, is
a better quantitative description of a graph than its degree distribution.
%In this thesis,
The degree distribution is used here, but
for further reference on spectral graph theory, see Chung \citeyear{chung_spec} and Chung and Lu \citeyear{chung_complex}.
Farkas, Der\'enyi, Barab\'asi and Viscek \citeyear{farkas} discuss real world graphs and their deviation from Wigner's semi-circle law.
Mihail and Papadimitriou \citeyear{mihail_papadimitriou} give arguments for showing that the spectrum of a graph is highly
tied to its degree distribution and show, for graphs with a degree distribution that is power law, the first few eigenvalues
display power law behavior.
One could imagine constructing a random instance from another NP-Complete problem, transforming it to a Hamiltonian cycle instance via
some reduction, and then analyzing the resulting complexity of the graphs produced. For example, by using a standard reduction from
3-SAT to Hamiltonian cycle, one could generate a random instance in 3-SAT, reduce it to a Hamiltonian cycle and note the difficulty.
This method introduces the problem of adding extra structure to the resulting instance.
One could run standard algorithms on this reduced graph, but most algorithms
are optimized for random graphs and make implicit assumptions about how random graphs look locally.
In the above example, one of the standard reductions from 3-SAT to
Hamiltonian cycle constructs the graph by introducing a set of small widgets \cite{cormen}.
These widgets are highly structured, and so is the resulting graph.
Algorithms could be redesigned to exploit the structure introduced
by the reduction, but at the cost of a polynomial increase in run time.
In the opinion of the authors,
%In the author's opinion,
the deeper issue is that focusing on graph instances reduced from some other NP-Complete
class obfuscates questions about intrinsic complexity.
Assuming one could perfectly overcome the structure introduced by the reduction, one is still left
with the intrinsic complexity of the distribution used to create the original instance,
while questions remain unanswered about which distribution generates, and where to find,
the intrinsically complex instances of the target NP-Complete class.
%It is the author's opinion that
In the opinion of the authors,
focusing on random graph instances chosen directly from a graph ensemble, rather than a derived one whose
complexity is inherited from the original NP-Complete class, provides more insight into the intrinsic difficulty
of the NP-Complete class under investigation.
\section{Algorithms Used}
\label{culb-van}
\subsection{Graph Generation}
The ``Erased Configuration Model'' \cite{molloy}
graph generation algorithm is used
for graph construction.
Speed of graph generation is a priority.
Matching the exact degree sequence specified
is of little concern as long as the limiting distribution
is the same,
which is the case for the Erased Configuration Model \cite{britton}.
The algorithm creates hooks for each vertex, where the number of hooks for each vertex is initially given by the degree
sequence. Going from the vertex with the most hooks first, another hook is chosen uniformly at random from the pool
still unattached. A connection is made for this pairing.
For simplicity's sake, vertices are temporarily allowed to have self loops
and multiple edges. Once the pool of hooks is exhausted, self loops are removed and multi-edges are collapsed
into one.
%Algorithm \ref{genapproxdegseq} gives pseudo-code for this process.
%\begin{algorithm}
%\caption{ GenerateApproxDegSequenceGraph }
%\label{genapproxdegseq}
%\KwIn{ int n, int deg[\ ] }
%\KwOut{ Graph G \BlankLine }
%int hook[ $n$ ][\ ] \;
%int nhook=0 \;
%Graph G \;
%sort deg in descending order \;
%\ForEach{ $v$ in $0 \dots (n-1)$ }{
% \ForEach{ $j$ in $0 \dots deg[v] - 1$ } {
% hook[ $v$ ][ $j$ ] = $v$ \;
% }
%}
%\ForEach{ v in $0 \dots (n-1)$ }{
% \While{ hook[ $v$ ] }{
% Choose vertex $u$ uniformly at random from all entries in hook array \;
% Remove a hook for $u$ and a hook for $v$ from the hook array \;
% Add edge $u \sim v$ to G \;
% }
%}
%Collapse multiple edges and remove self loops in G \;
%\Return G \;
%\end{algorithm}
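The pairing process described above can be sketched in a few lines of Python (our own illustration; consuming hooks in any fixed order with uniformly chosen partners is equivalent in distribution to the uniform random matching of hooks that the shuffle below produces):

```python
import random

def erased_configuration_model(deg, rng=random):
    """Erased Configuration Model: pair half-edge 'hooks' uniformly at
    random, then erase self-loops and collapse multiple edges."""
    hooks = [v for v, d in enumerate(deg) for _ in range(d)]
    rng.shuffle(hooks)                     # uniform random pairing of hooks
    edges = set()
    for i in range(0, len(hooks) - 1, 2):  # an odd leftover hook is dropped
        u, v = hooks[i], hooks[i + 1]
        if u != v:                         # erase self-loops
            edges.add(frozenset((u, v)))   # set membership collapses multi-edges
    return edges
```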
Before proceeding further, L\'evy-Stable distributions will be discussed,
as they are the probability distribution used for the degree sequences
of most of the random graphs generated in this paper.
The family of L\'evy-Stable probability distributions is, under some very general
conditions, the family of limiting distributions for sums of independent and identically
distributed random variables \cite{nolan,feller}.
In general, L\'evy-Stable distributions do not admit closed
form densities and are most often defined in terms of
their characteristic function:
$$ \phi(t) = \exp ( -|\gamma t|^\alpha ( 1 - i \beta \text{sign}(t) \tan(\pi \alpha / 2)) + i \delta t) $$
where $\text{sign}(t)$ returns the sign of $t$, $\alpha$ is the critical exponent,
$\beta$ is the skew, $\gamma$ is the scale and $\delta$ is the shift.
The probability density function
can be given by the Fourier Transform of its characteristic function:
%See Nolan \cite{nolan}, Zolotarev \cite{zolotarev} and Feller \cite{feller} for more
%details on the properties of L\'evy-Stable distributions.
$$ p(x) = \frac{1}{2\pi} \int_{-\infty}^{\infty} dt \ \exp( - i t x - |\gamma t |^\alpha
( 1 - i \beta \text{sign}(t) \tan(\pi \alpha /2)) + i \delta t ) $$
%\cite{nolan,zolotarev,feller}:
Only for a few
values of $\alpha$ does the L\'evy-Stable distribution reduce to a closed form
solution\footnote{For example, a L\'evy distribution follows
if $\alpha = \frac{1}{2}$, a Cauchy distribution for $ \alpha = 1 $, and a Normal distribution for $ \alpha = 2$. See Nolan \citeyear{nolan}
for more details.}.
One can view $\alpha$ as a parameter to set the degree of the power law tail of the distribution.
$\gamma$ is analogous to the variance term for the Normal distribution, though this analogy is not exact \cite{nolan}.
%See Nolan \cite{nolan} for more details.
In Nolan's \citeyear{nolan} notation,
the above corresponds
to a distribution chosen from the $S(\alpha, \beta, \gamma, \delta; 0)$ class of L\'evy-Stable distributions.
%For simplicities sake, only the class of symmetric L\'evy-Stable distributions is considered i.e. $\beta=0$, $\delta=0$:
For simplicity's sake, only the class of symmetric L\'evy-Stable distributions is considered i.e. $\beta=0$, $\delta=0$:
$$ p(x) = \frac{1}{2 \pi} \int_{-\infty}^{\infty} dt \exp( - i t x - \gamma^\alpha | t |^\alpha ) $$
A salient feature is their power law tails. From Nolan \citeyear{nolan}, for $x \gg 0$:
$$ P( X > x ) \sim \gamma^{\alpha} c_{\alpha} (1 + \beta) x^{-\alpha} $$
$$ p( x ) \sim \alpha \gamma^{\alpha} c_{\alpha} (1 + \beta) x^{ - (\alpha + 1) } $$
Here, $X$ is the L\'evy-Stable random variable drawn from $S(\alpha, \beta, \gamma, \delta; 0)$,
$p(\cdot)$ is the probability density function, $P( X > x )$ is the tail probability
and $ c_{\alpha} = \sin( \pi \alpha / 2 ) \Gamma(\alpha) / \pi $.
For this reason, L\'evy-Stable distributions are often called ``fat-tailed'' distributions, with
the $\alpha$ parameter setting the `corpulence' of the tail of the distribution.
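These tail formulas can be sanity-checked against the one symmetric case with a simple closed form, $\alpha = 1$ (the Cauchy distribution). The sketch below is our own and assumes $c_{\alpha} = \sin(\pi\alpha/2)\,\Gamma(\alpha)/\pi$ as above:

```python
import math

def cauchy_tail(x):
    """Exact tail P(X > x) of the standard Cauchy law
    (the alpha = 1, beta = 0, gamma = 1 stable distribution)."""
    return 0.5 - math.atan(x) / math.pi

def stable_tail_asymptotic(x, alpha=1.0, beta=0.0, gamma=1.0):
    """Tail asymptotic P(X > x) ~ gamma^alpha c_alpha (1 + beta) x^-alpha,
    with c_alpha = sin(pi alpha / 2) Gamma(alpha) / pi."""
    c = math.sin(math.pi * alpha / 2) * math.gamma(alpha) / math.pi
    return gamma ** alpha * c * (1 + beta) * x ** (-alpha)

# The two agree ever more closely as x grows:
for x in (10.0, 100.0, 1000.0):
    print(x, cauchy_tail(x), stable_tail_asymptotic(x))
```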
An attractive feature of
the L\'evy-Stable distributions is their stability:
Should the sum of independent and identically distributed (i.i.d.)\ random variables (r.v.'s) converge,
then they converge to one parametrization in the L\'evy-Stable family
% the sum of independent and identically distributed (i.i.d.)
%random variables (r.v.'s) converges to one parametrization in the L\'evy-Stable family
\cite{nolan,feller,zolotarev}.
If one starts with i.i.d.\ r.v.'s drawn from a L\'evy-Stable distribution, then the resulting sum
will be L\'evy-Stable with the same parametrization save for a rescaling factor.
This can be stated as (again, using Nolan's \citeyear{nolan} notation):
$$ X_k \in S(\alpha, \beta, \gamma, \delta; 0), \ \ \ k \in [0 \dots n-1 ]$$
$$ X_0 + X_1 + \dots + X_{n-1} \sim S(\alpha, \beta, n^{1 / \alpha} \gamma, \delta_*; 0) $$
$$ \delta_* =
\left\{
\begin{array}{ll}
n \delta + \gamma \beta \tan( \pi \alpha / 2) (n^{1/\alpha} - n ), & \alpha \ne 1 \\
n \delta + 2 \gamma \beta n \ln(n) / \pi , & \alpha = 1
\end{array}
\right.
$$
As a reminder, the sum of finite-variance i.i.d.\ r.v.'s
converges to a Normal distribution.
%, which is indeed in the family of L\'evy-Stable distributions.
Once the finite variance
condition is dropped, the Gaussian is no longer the limiting distribution
and another member of the family of L\'evy-Stable distributions
is the limit for sums of such random variables.
Under some very general conditions, we are assured
that for the sum of r.v.'s with differing underlying distributions,
one in the class of L\'evy-Stable distributions is likely
to result \cite{feller}.
When needed, the GNU Scientific Library (GSL) is used as it
provides functions for simulating L\'evy-Stable random variables.
To generate graphs with heavy-tails, a truncated L\'evy-Stable distribution is chosen. For simplicity, the L\'evy-Stable
distributions chosen are symmetric and centered around the origin. Only the scale parameter
and critical exponent are varied.
For each vertex, a random draw is taken from this distribution, forced positive and truncated to an integer.
The value is then clamped to be within the minimum and maximum degree, if specified. If none are specified, the
default minimum and maximum values are 0 and $n-1$ respectively.
%Algorithm \ref{levydeg} shows pseudo-code for this process.
%\begin{algorithm}
%\caption{ GenerateLevyDegreeSequence }
%\label{levydeg}
%\SetKwFunction{GenLevyDegSeq}{GenLevyDegSeq}
%\KwIn{ $\alpha$, $\gamma$, mindeg, maxdeg }
%\KwOut{ deg[\ ] \BlankLine }
%\For{ $i \rightarrow 0 $ \KwTo $ n-1 $ }{
% $t$ = $|$ gsl\_ran\_levy($\alpha$, $\gamma$) $|$ \;
% \If{ $t < mindeg$ }{ $t = mindeg$ \; }
% \If{ $t > maxdeg$ }{ $t = maxdeg$ \; }
% deg[ $i$ ] = $t$ \;
%}
%\Return deg \;
%\end{algorithm}
%The \verb!gsl_ran_levy! in Algorithm \ref{levydeg} is a call to the \verb!gsl_ran_levy! function
%in the GNU Scientific Library (GSL).
%The \verb!gsl_ran_levy! function is used to simulate a symmetric L\'evy alpha stable random variable with
%exponent $\alpha$ and scale parameter $\gamma$.
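The degree sequence draw described above can be sketched as follows. The paper uses GSL's \verb!gsl_ran_levy!; the pure-Python stand-in below simulates a symmetric stable variate with the Chambers-Mallows-Stuck method (our substitution, not the paper's code):

```python
import math
import random

def symmetric_stable(alpha, gamma, rng=random):
    """Symmetric Levy alpha-stable variate via the Chambers-Mallows-Stuck
    method; a stand-in for GSL's gsl_ran_levy(alpha, gamma)."""
    V = rng.uniform(-math.pi / 2, math.pi / 2)
    W = rng.expovariate(1.0)
    if alpha == 1.0:
        return gamma * math.tan(V)   # alpha = 1 is the Cauchy case
    x = (math.sin(alpha * V) / math.cos(V) ** (1 / alpha)) * \
        (math.cos((1 - alpha) * V) / W) ** ((1 - alpha) / alpha)
    return gamma * x

def levy_degree_sequence(n, alpha, gamma, mindeg=0, maxdeg=None, rng=random):
    """Draw each degree from |stable|, truncate to an integer, and clamp
    to [mindeg, maxdeg] (defaults 0 and n-1)."""
    if maxdeg is None:
        maxdeg = n - 1
    return [min(max(int(abs(symmetric_stable(alpha, gamma, rng))), mindeg), maxdeg)
            for _ in range(n)]
```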
Since the graphs under consideration
have fewer than 1000 vertices, arbitrary precision is not a concern.
\subsection{Complete Hamiltonian Cycle Algorithm}
A slightly modified version of the algorithm of Culberson and Vandegriend \citeyear{vandegriend}
is used. The modified algorithm is
a graph traversal algorithm with backtracking. A path is extended, if possible, and the graph is pruned. If
no extension is possible, backtracking occurs.
\begin{figure}
\centering
\includegraphics[scale=1.0]{gv/prune_path.ps}
\caption{ Pruning edges from a partial path through the graph. Bold lines are the partial path. Dashed lines are the
edges that can be pruned. }
\label{fig:prune_path}
\end{figure}
\begin{figure}
\centering
%\includegraphics[scale=0.45]{gv/prune_mid.ps}
\includegraphics[scale=0.4]{gv/prune_mid.ps}
\caption{ Pruning edges from a vertex, $W$, that is centered between two degree 2 vertices, $U$ and $V$. A Hamiltonian cycle
must pass through $W$ via $U$ and $V$, so all other edges can be pruned. Edges that can be pruned are dashed in this figure. }
\label{fig:prune_mid}
\end{figure}
\begin{figure}
\centering
%\includegraphics[scale=0.45]{gv/prune_chain.ps}
\includegraphics[scale=0.4]{gv/prune_chain.ps}
\caption{ Chains of degree 2 vertices that do not form a complete circuit, and whose endpoints are connected, can have the edge connecting their endpoints pruned. In this graph, the chain
of degree 2 vertices runs from $b$ through $h$, with the endpoints $a$ and $i$ connected. The connecting edge that can be pruned is dashed. }
\label{fig:prune_chain}
\end{figure}
There are three pruning techniques. First,
whenever the path is extended, all edges that are incident to non-endpoint vertices of the current path but are not part of the path itself are removed,
as no Hamiltonian cycle can use those edges. Figure \ref{fig:prune_path} shows
an example of this pruning technique, where edges that are available for pruning are dashed. The second
pruning technique considers any vertex that sits between two vertices of degree exactly 2. For each such vertex,
all edges not connecting it to its degree 2 neighbors can be pruned, as a Hamiltonian cycle must
pass through the vertex via those neighbors. Figure \ref{fig:prune_mid} shows an example of this pruning technique, where
$U$ and $V$ are both degree 2 vertices, and all of $W$'s edges can be pruned save the ones connected to $U$ and $V$.
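These first two pruning techniques can be sketched in Python over an adjacency-set representation. This is an illustrative one-pass sketch, not the paper's implementation: the real algorithm repeats the degree 2 pruning to a fixed point and aborts when a vertex degree drops below 2.

```python
def prune_path_edges(adj, path):
    """Technique 1: each internal vertex of the partial path keeps only
    its two path edges; every other incident edge is removed."""
    for i in range(1, len(path) - 1):
        keep = {path[i - 1], path[i + 1]}
        for w in list(adj[path[i]]):
            if w not in keep:
                adj[path[i]].discard(w)
                adj[w].discard(path[i])

def prune_degree2_middle(adj):
    """Technique 2: if a vertex has exactly two neighbours of degree 2,
    a Hamiltonian cycle must enter and leave through them, so all of
    its other incident edges are removed."""
    for w in list(adj):
        two = [u for u in adj[w] if len(adj[u]) == 2]
        if len(two) == 2:
            for x in list(adj[w]):
                if x not in two:
                    adj[w].discard(x)
                    adj[x].discard(w)
```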
%\begin{algorithm}
%\caption{ PruneGraph }
%\label{prunegraph}
%\SetKwFunction{prunegraph}{prunegraph}
%\KwIn{ Graph G }
%\KwOut{ Graph G' }
%
%stillPruning = true \;
%\While{ stillPruning }{
% twov = all vertices of degree 2 \;
% \ForEach{ $u$ in twov }{
% $(v_0, v_1) = Neighbors(u)$ \;
% \For{ $i \rightarrow {0,1} $ }{
% \ForEach{ $w \in Neighbors(v_i) / u $ }{
% \If{ $deg(w) \text{ != } 2$ } {
% continue \;
% }
% \ForEach{ $x \in Neighbors(v_i)/\{u, w\} $ }{
% $E = E/(x, v_i)$ \;
% \If{ $deg(v_i) < 2$ }{ \Return $(\ )$ \; }
% \If{ $deg(v_i) == 2$ }{ $twov = twov \cup v_i $ \; }
% \If{ $deg(x) < 2$ }{ \Return $(\ )$ \; }
% \If{ $deg(x) == 2$ }{ $twov = twov \cup x $ \; }
% }
% }
% }
% }
%
% $(stillPruning, G) = PruneChain(G)$ \;
% \If{ $G == $ null }{ \Return $(\ )$ \; }
%}
%\Return G \;
%
%\end{algorithm}
%
%\begin{algorithm}
%\caption{ PruneChain }
%\label{prunechain}
%\SetKwFunction{prunechain}{prunechain}
%\KwIn{ Graph G }
%\KwOut{ boolean prunedEdge, Graph G' }
%$(V, E) = G$ \;
%$marked[|V|] = ()$ \;
%twov = all vertices of degree 2 \;
%\ForEach{ $v \in twov$ } {
% \If{ $marked[v]$ } { next \; }
% $(v_0, v_1) = Neighbor(v)$ \;
% $marked[v] = 1$ \;
% \For{ $i \in {0,1}$ }{
% $v_{prev} = v$ \;
% \While{ $deg(v_i) == 2$ }{
% \If{ $v_i$ == $v$ }{ \Return $($false, $G)$ \; }
% $marked[v_i] = 1$ \;
% $ t = v_i $ \;
% $v_i = Neighbor(v_i) / v_{prev} $ \;
% $v_{prev} = t$ \;
% }
% }
% \If{ not $v_0 \sim v_1$ }{ next \; }
% $E = E / (v_0, v_1) $ \;
% \If{ $deg(v_0) < 2$ or $deg(v_1) < 2$}{ \Return $($undef, null$)$ \; }
% \Return $($true$, G)$ \;
%}
%\Return $($false$, G)$ \;
%\end{algorithm}
The last pruning technique considers any chain
of degree 2 vertices that is shorter than the total number of vertices in the graph.
If there is an edge connecting the endpoints
of the chain, it can be pruned, as any path using this edge would close a loop and preclude a Hamiltonian cycle.
Figure \ref{fig:prune_chain} shows an example of this.
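This chain pruning can be sketched in the same adjacency-set style (again illustrative; the paper's version also signals failure when a pruned endpoint drops below degree 2):

```python
def prune_chains(adj):
    """Technique 3: walk each maximal chain of degree 2 vertices; if the
    chain does not close on itself and its two endpoints are adjacent,
    remove the edge joining the endpoints. Returns True if an edge was
    pruned (the caller can iterate to a fixed point)."""
    pruned = False
    seen = set()
    for v in list(adj):
        if v in seen or len(adj[v]) != 2:
            continue
        seen.add(v)
        ends = []
        for start in adj[v]:
            prev, cur = v, start
            while len(adj[cur]) == 2 and cur != v:
                seen.add(cur)
                prev, cur = cur, next(u for u in adj[cur] if u != prev)
            if cur == v:              # the chain is a full circuit
                ends = None
                break
            ends.append(cur)          # first vertex of degree != 2
        if ends and ends[0] != ends[1] and ends[1] in adj[ends[0]]:
            adj[ends[0]].discard(ends[1])
            adj[ends[1]].discard(ends[0])
            pruned = True
    return pruned
```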
%Algorithms \ref{prunegraph} and \ref{prunechain} gives pseudo-code for these pruning techniques.
%\begin{algorithm}
%\caption{ CutSetHeuristic }
%\label{cutset}
%\KwIn{ Graph G }
%\KwOut{ Boolean noham }
%$(V, E) = G$ \;
%$k = 1$ \;
%\ForEach{ $v \in V$, maximum degree first }{
% remove $v$ from $G$ \;
% $c = NumComponents(G)$ \;
% \If{ $c > k$}{ \Return false \; }
% $k = k + 1$ \;
%}
%\Return true \;
%\end{algorithm}
Any graph that can be partitioned into $k+1$ or more components by the removal of $k$ vertices cannot contain a Hamiltonian cycle.
This follows from a pigeonhole argument: a Hamiltonian cycle must visit each of the $k+1$ or more components with only
$k$ vertices to connect them. See Bondy and Murty \citeyear{bondy} for a proof.
The converse is not true as graphs exist for which there is no Hamiltonian cycle yet no choice of $k$
vertices to remove will partition the graph into $k+1$ or more components. Finding a set of $k$ vertices that partition
the graph into $k+1$ or more components provides a certificate of non-Hamiltonicity, whereas the inability to find
such a cut-set gives no guarantee as to whether a Hamiltonian cycle exists in the graph.
Using this idea, a cut-set heuristic is added to Culberson and Vandegriend's algorithm as a test
for graphs in which no Hamiltonian cycle can occur.
Graphs are filtered through the cut-set heuristic once at the start of the search, before the backtracking algorithm commences.
The cut-set heuristic proceeds in
$n$ iterations. At iteration $k$,
the remaining vertex with the largest degree is removed and the number of connected
components is recounted. If the number of connected components
ever exceeds $k$, we know that no Hamiltonian cycle can exist.
One can choose any schedule of vertex removal
for this heuristic. Maximum degree first is chosen
as the authors have observed
that removing large degree vertices first works best, most
likely because the larger connectivity of high degree vertices gives a higher likelihood of separating
the graph into more components.
%Algorithm \ref{cutset} gives pseudo-code for this heuristic.
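The cut-set heuristic can be sketched as follows (an illustrative Python version over adjacency sets, not the paper's implementation):

```python
from copy import deepcopy

def num_components(adj):
    """Count connected components by depth-first search."""
    seen, parts = set(), 0
    for s in adj:
        if s in seen:
            continue
        parts += 1
        stack = [s]
        while stack:
            u = stack.pop()
            if u not in seen:
                seen.add(u)
                stack.extend(adj[u])
    return parts

def cut_set_certificate(adj):
    """Remove vertices largest-degree-first; if after k removals the
    graph has more than k components, non-Hamiltonicity is certified.
    Returns True for a certificate, False when inconclusive."""
    adj = deepcopy(adj)
    for k in range(1, len(adj)):
        v = max(adj, key=lambda u: len(adj[u]))  # largest degree first
        for w in adj.pop(v):
            adj[w].discard(v)
        if num_components(adj) > k:
            return True
    return False
```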
A schedule of lowest degree first is used when traversing. If the number of nodes searched reaches a threshold, the algorithm
is restarted from another vertex, chosen at random, with the threshold doubled. In this way bad initial choices are
mitigated and the threshold eventually becomes large enough to traverse the whole search space.
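The restart-and-double wrapper can be sketched generically. Here \texttt{try\_from} stands in for the budgeted backtracking search and is assumed to return a (found, cycle, nodes used) triple; this interface is hypothetical, chosen only to illustrate the doubling schedule.

```python
import random

def search_with_restarts(graph, vertices, try_from, budget, rng=random):
    """Run the budgeted search; when the node budget is exhausted,
    restart from a fresh random vertex with the budget doubled, so a
    bad initial choice costs at most a constant factor overall."""
    while True:
        start = rng.choice(vertices)
        found, cycle, used = try_from(graph, start, budget)
        if found:
            return cycle
        if used < budget:   # search space exhausted: provably no cycle
            return None
        budget *= 2         # inconclusive: double the threshold, retry
```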
%\begin{algorithm}
%\caption{ FindHamCycle }
%\label{findham}
%\KwIn{ Graph G, path p, int l, int MaxIter, int CurIter }
%\KwOut{ int Iter, Boolean found, path p }
%CurIter = CurIter + 1 \;
%\If{ CurIter $\ge$ MaxIter }{ \Return ( CurIter, false, (\ ) ) \; }
%$( V, E ) = G$ \;
%\If{ $|p| = |V|$ }{
% \If{ $p_0 \sim p_{n-1}$ }{
% \Return $(CurIter, true, p)$ \;
% }
% \Else{ \Return $(CurIter, false, ())$ \; }
%}
%$G' = PruneGraph(G)$ \;
%$vchoice = DegSortAsc(Neighbors(p_l) / p_{l-1} )$ \;
%\ForEach{ $v \in $ vchoice }{
% $p_{l+1} = v$ \;
% $G'' = PrunePath(G', path)$ \;
% $(CurIter, r, pp) = FindHamCycle(G'', p, l+1, MaxIter, CurIter)$ \;
% \If{ r }{ \Return $(CurIter, true, pp)$ \; }
%}
%\Return $(CurIter, false, ())$ \;
%\end{algorithm}
%
%\begin{algorithm}
%\caption{ FindHamCycleComplete }
%\label{findhamcomplete}
%\KwIn{ Graph G, path p, length l }
%\KwOut{ int Iter, Boolean found, path p }
%$(V, E) = G$ \;
%\If{ CutSetHeuristic( $G$ ) is false }{ \Return $( Iter, false, () )$ \; }
%\For{ $v \in V$ }{
% $MaxIter = 2 |V|$ \;
% ( Iter, found, p ) = FindHam( G, p, 0, $2 |V|$, 0 ) \;
% \If{ found }{ \Return ( Iter, true, p ) \; }
%}
%found = false \;
%\While{ not found }{
% ( Iter, found, p ) = FindHam( G, p, 0, MaxIter, 0) \;
% \If{ found or (not found and Iter $<$ MaxIter ) }{
% \Return ( Iter, found, p ) \;
% }
% MaxIter = 2 MaxIter \;
%}
%
%\end{algorithm}
%
%\begin{algorithm}
%\caption{PrunePath}
%\KwIn{Graph G = (V, E), path p}
%\KwOut{Graph G'}
%\ForEach{ $i \in [1, \dots, |p|-2] $ }{
% \ForEach{ $w \in Neighbors(p_i)/ \{ p_{i-1}, p_{i+1} \}$ }{
% $E = E / (p_i, w)$ \;
% }
%}
%return $(V, E)$ \;
%\end{algorithm}
A bad initial choice of starting vertex can inflate the search cost well beyond what
a better starting vertex would incur. See Culberson and Vandegriend \citeyear{vandegriend} for
details on the inflated run-times observed when searching for Hamiltonian cycles in Erd\"os-R\'enyi random graphs.
As an extra precaution
against a bad initial vertex choice, the algorithm is first run without the exponential
threshold restart: the threshold is fixed at $2n$ and the search is attempted once from each vertex of
the graph.
Only after this initial check has been run from every vertex is the
complete algorithm
run to determine Hamiltonicity.
%Algorithm \ref{findham} gives the pseudo-code for the recursive algorithm to find a Hamiltonian cycle and
%Algorithm \ref{findhamcomplete} gives the pseudo-code for the complete algorithm.
With all these heuristics combined, finding Hamiltonian cycles in Erd\"os-R\'enyi graphs becomes much easier.
The authors were unable to find
any instances of Erd\"os-R\'enyi graphs that took more than a small constant factor of $n$ in
nodes traversed to determine Hamiltonicity. This suggests that, while the pruning techniques and checks for
cut-sets are only heuristics that, in general,
do not give an exponential gain in algorithmic speed, they are well suited to search for Hamiltonian cycles
in Erd\"os-R\'enyi graphs.
Culberson and Vandegriend's algorithm did not include the cut-set heuristic or the initial small threshold check for
Hamiltonian cycles. The initial small threshold check was added after noticing that nearly all Erd\"os-R\'enyi
random graphs were solved by a good initial vertex choice. The cut-set heuristic check was added after noticing
that many L\'evy-Stable degree distributed random graphs in the impossible region were easily verified to
have no Hamiltonian cycle with this heuristic in place.
%Without the cut-set heuristic, many L\'evy-Stable
%degree distributed random graphs are not easily found to have no Hamiltonian cycle with just Culberson and Vandegriend's
%algorithm.
With the cut-set heuristic, many L\'evy-Stable degree distributed random graphs
are easily verified to be non-Hamiltonian that would be difficult for the vanilla Culberson
and Vandegriend algorithm.
\subsection{P\'osa Heuristic}
\begin{figure}
\centering
\includegraphics[scale=0.35]{graphs/posa_0.ps}
\includegraphics[scale=0.35]{graphs/posa_1.ps}
\caption{ An example of the P\'osa heuristic used to rotate the current path so that extension is still
possible. }
\label{fig:posa_example}
\end{figure}
For larger graphs, an incomplete randomized algorithm is desirable for reasons of speed.
The P\'osa heuristic transforms the current incomplete path in an attempt to jostle it into
a state where the path can be further extended. Assume the current path of length $l$ is $ ( w_0 w_1 \dots w_{l-2} w_{l-1} ) $.
%Choose a random $k$ such that $w_k$ and $w_{l-1}$ are connected.
A random $k$ is chosen such that $w_k$ and $w_{l-1}$ are connected.
%Remove the edge $(w_{k} , w_{k+1})$ from the current path
The edge $ ( w_{k} , w_{k+1} ) $ is then removed from the current path
and the edge $ ( w_k , w_{l-1} ) $ is added, making a new path $ ( w_0 w_1 \dots w_{k-1} w_k w_{l-1} w_{l-2} \dots w_{k+1} ) $.
The hope is that the path can now be further extended from its new endpoint, $w_{k+1}$.
If no choice of $k$ is available, the algorithm can be restarted. Figure \ref{fig:posa_example} shows an example of applying
the P\'osa heuristic for a portion of a graph.
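A single rotation step can be sketched in Python, assuming the path is a vertex list and \texttt{adj} maps each vertex to its neighbour set (the names are illustrative):

```python
import random

def posa_rotate(path, adj, rng=random):
    """Pick a random pivot w_k adjacent to the endpoint w_{l-1}, then
    reverse the tail: the new path is w_0 .. w_k, w_{l-1}, .., w_{k+1},
    with new endpoint w_{k+1}. Returns None if no pivot exists."""
    end = path[-1]
    # k = l - 2 would remove and re-add the same edge, so exclude it
    pivots = [k for k in range(len(path) - 2) if path[k] in adj[end]]
    if not pivots:
        return None          # no rotation available; caller may restart
    k = rng.choice(pivots)
    return path[:k + 1] + path[k + 1:][::-1]
```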
%\begin{algorithm}
%\caption{P\'osa}
%\label{posa}
%\KwIn{Graph G, Iter}
%\KwOut{path p}
%$Iter=0$\;
%$visited = \o$ \;
%$p_0$ = random vertex \;
%$l=1$\;
%\Repeat{ $Iter == MaxIter$ or cycle is found }{
% \If(\tcc*[f]{extend path}){ $ |Neighbors(p_{l-1}) / visited| > 0 $}{
% $p_l$ = a random vertex from $Neighbors(p_{l-1}) / visited$ \;
% $visited = visited \cup p_l$ \;
% $l = l + 1$ \;
% continue \;
% }
% \ElseIf{ $l \le 2$ or $deg(p_l)==1$}{ \Return $\o$ \; }
% \Else(\tcc*[f]{rotate}){
% $A = Neighbors(p_{l-1}) \cap ( visited / p_{l-2} ) $ \;
% $w$ = choose a random vertex from $A$ \;
% $p = p_0 p_1 p_2 \dots p_{k-1} p_{l-1} p_{l-2} \dots p_{k+1} w$ \;
% $Iter = Iter + 1$ \;
% }
%}
%\If{$Iter == MaxIter$}{ return $\o$ \; }
%\Else{ return $p$ \; }
%\end{algorithm}
This algorithm does not determine non-Hamiltonicity. For very large graphs, though, this algorithm can be run efficiently
and works reasonably well in practice. The reader is referred to Section \ref{sec:Results} for some comparisons on how
the P\'osa heuristic fares against the modified version of Culberson and Vandegriend's algorithm
for Erd\"os-R\'enyi and L\'evy-Stable degree distributed random graphs.
%Algorithm \ref{posa} provides pseudo-code for the algorithm using the P\'osa heuristic.
\subsection{Graph Implementation Details}
There are two different ways graphs are implemented, depending on whether the complete algorithm or the
P\'osa heuristic algorithm is used.
For the small graphs used in the complete algorithm, each vertex keeps a list of its neighbors in a dynamic array. A ``back-pointer''
matrix that doubles as an adjacency matrix is also kept. If the $[i][j]$ entry of the
back-pointer matrix, $a$, is non-negative, there is an edge between vertex $i$ and vertex $j$, and the value of $a[i][j]$ is
the position of vertex $j$ in $i$'s neighbor list. In this way, edges can be inserted and deleted with only a constant number of operations.
This is
desirable for the complete algorithm as edges are being removed and inserted frequently during execution of the algorithm. Arrays are kept for
quick lookup of vertices of degree 2 so that the graph need not be traversed again when these vertices are needed.
For larger graphs, a simple vertex list is used instead, where each vertex has a neighbor
list ordered by vertex name.
The ordering allows for $O(\lg(n))$ adjacency tests. This simple structure is sufficient for the
P\'osa heuristic algorithm and is used for graphs in the range of 100 to 1000 vertices.
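The back-pointer structure can be sketched as follows; this is an illustrative Python version, shown only to make the constant-time insert and swap-with-last delete concrete:

```python
class BackPointerGraph:
    """Neighbour lists plus a matrix bp, where bp[i][j] >= 0 means edge
    (i, j) exists and equals j's position in i's neighbour list."""

    def __init__(self, n):
        self.nbr = [[] for _ in range(n)]
        self.bp = [[-1] * n for _ in range(n)]

    def has_edge(self, i, j):
        return self.bp[i][j] >= 0

    def add_edge(self, i, j):
        if not self.has_edge(i, j):
            self.bp[i][j] = len(self.nbr[i]); self.nbr[i].append(j)
            self.bp[j][i] = len(self.nbr[j]); self.nbr[j].append(i)

    def _drop(self, i, j):
        # overwrite j's slot with the last neighbour, then shrink: O(1)
        pos, last = self.bp[i][j], self.nbr[i].pop()
        if last != j:
            self.nbr[i][pos] = last
            self.bp[i][last] = pos
        self.bp[i][j] = -1

    def remove_edge(self, i, j):
        self._drop(i, j)
        self._drop(j, i)
```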
%---------------- COMPLETE SMALL
\begin{figure}
\centering
\includegraphics[width=0.34\textwidth,angle=-90]{ham_complete_prob_a0.5.ps}
\includegraphics[width=0.34\textwidth,angle=-90]{ham_complete_it_a0.5.ps}
%\end{figure}
%\begin{figure}
\centering
\includegraphics[width=0.34\textwidth,angle=-90]{ham_complete_prob_a1.0.ps}
\includegraphics[width=0.34\textwidth,angle=-90]{ham_complete_it_a1.0.ps}
\includegraphics[width=0.34\textwidth,angle=-90]{ham_complete_prob_a1.5.ps}
\includegraphics[width=0.34\textwidth,angle=-90]{ham_complete_it_a1.5.ps}
\caption{Probability (left) and nodes searched (right) for $\alpha=0.5$ (top), $\alpha = 1.0$ (middle) and $\alpha = 1.5$ (bottom) compared against the scale parameter, $\gamma$. The $x$ axis in each case is $\gamma$ and the $y$ axis is probability (left) and total nodes traversed (right). Each point represents the average of 200 graphs.
%A modified version of Vandegriend's algorithm
The complete version of Culberson and Vandegriend's algorithm
%(Algorithm {\bf FindHamCycleComplete} from Chapter 3)
was used to find Hamiltonian cycles.
%A complete search algorithm was used with random restart and double technique.
Nodes searched are on a log-scale. }
\label{complete_a0.5}
\label{complete_a1.0}
\label{complete_a1.5}
\end{figure}
\section{Results}
\label{sec:Results}
%\subsection{Small Graph Instances}
%\label{ham_complete_small}
%
%A slightly modified version of Culberson and Vandegriend's algorithm was used %, {\bf FindHamCycleComplete},
%to find Hamiltonian Cycles in small graphs ($N < 100$ vertices) whose degree sequence
%was chosen from a modified L\'evy-Stable distribution. The algorithms used
%in this section are briefly discussed below.
%%The reader is referred to Chapter 3 for further explanation and pseudo-code.
%Culberson and Vandegriend's original
%algorithm can be found in \cite{vandegriend}.
%
%The Erased Configuration Model by Molloy and Reed \cite{molloy} is employed
%for creating a graph from its degree sequence.
%A degree distribution is created where each vertex
%degree is a random value drawn from a discretized random variable whose
%distribution is L\'evy-Stable.
%After the degree sequence is chosen, a
%best effort approach is done to connect vertices, collapsing multiple edges into one
%and removing loops.
%%See Chapter 3, algorithm {\bf GenerateApproxDegSequenceGraph }
%%for pseudo-code for this algorithm.
%
%Once a graph is generated %, %{\bf FindHamCycleComplete}
%Culberson and Vandegriend's algorithm
%is used to try to find a Hamiltonian Cycle. First, a cut
%set heuristic is employed to see if a removal of a subset of the vertices
%would produce enough disjoint components to preclude a Hamiltonian Cycle from
%occurring. Assuming this heuristic is passed, an initial attempt at
%finding a Hamiltonian Cycle is employed starting from each vertex in the graph
%but only running for $2 n$ iterations. Assuming no Hamiltonian Cycle is found
%from this initial pass, a backtracking search is then started, using a schedule
%of least degree first to traverse the search space. A maximum cut off of iterations
%is initially set and if the algorithm has not terminated before hitting
%this maximum, the algorithm is restarted with another initial vertex pick and
%the maximum iteration count doubled.
%
%At each point in the backtracking algorithm, three pruning techniques are employed.
%Firstly, any vertex straddled by vertices of degree 2
%has the rest of its edges removed
%as no Hamiltonian Cycle can use them. Secondly,
%any non complete path of degree 2 vertices
%has the edge connecting the two endpoints removed,
%if it exists, as no Hamiltonian Cycle will be able to use it. Finally,
%edges incident to any of the non-terminal vertices used in the current path
%that are not part of the current path
%are removed.
%
%
%The only modification to Culberson and Vandegriend's algorithm has been the
%initial heuristics of the cut set and initial loop through the vertices. The
%cut set heuristic was added after noticing that, even in the probability 0 region,
%many power law degree distributed graphs needed an exponential number of iterations
%to decide Hamiltonicity. The addition of this heuristic helped mitigate this
%increase in search cost and is the main reason why the search cost is so low
%in the probability 0 region for the graphs considered.
%
%The initial loop through the vertices and test to
%find a Hamiltonian cycle in $2 n$ iterations was employed after noticing that,
%for Erd\"os-R\'enyi random graphs, the occasional increase in node traversal
%disappeared if the graph was initially checked this way. After this heuristic
%was employed, the author found that the occasional polynomial search cost
%noticed by Culberson and Vandegriend disappeared. This heuristic was
%kept for graphs generated from a power law degree distribution as
%it was so effective with Erd\"os-R\'enyi random graphs.
\subsection{Numerical Results for Small Graph Instances}
\label{ham_complete_small}
The slightly modified version of Culberson and Vandegriend's algorithm was used
to find Hamiltonian cycles in small graphs ($n < 100$ vertices) whose degree sequence
was chosen from a modified L\'evy-Stable distribution. For further
detail on their original algorithm,
the reader is referred to Culberson and Vandegriend's paper \citeyear{vandegriend}.
Figure \ref{complete_a1.5} shows the probability of finding a Hamiltonian
cycle and the number of nodes searched by
the complete version of Culberson and Vandegriend's algorithm
for $n \in \{ 20, 26, 32, 38 \}$ and $\alpha \in \{0.5, 1.0, 1.5\}$ as
a function of the scale parameter, $\gamma$.
Each point represents 200 simulation runs. Run times are plotted on a semi-log scale.
Each graph was generated by
first drawing
a degree sequence where each
entry was drawn from a symmetric, absolute valued and truncated L\'evy-Stable distribution. This degree
sequence, with its entries raised to a minimum degree of 2 where necessary, was then used to generate the
graph.
%Passing a minimum degree into {\bf GenerateApproxDegSequenceGraph} only inflates values in the degree sequence
%to the minimum value provided.
%Since {\bf GenerateApproxDegSequenceGraph}
%is a best effort algorithm, this
%could lead to a graph returned that has a minimum degree less than the minimum degree passed in.
The graph generation method is a best effort approach and the minimum degree of 2 for the degree sequence
does not guarantee a minimum degree 2 in the resulting graph.
For each of the simulation runs in Figure \ref{complete_a1.5}, graphs whose
actual minimum degree was less than 2 were filtered out and not tested for Hamiltonicity. Enough graphs were
generated so as to provide for 200 graphs that met the minimum degree 2 requirement
for each $\alpha$, $\gamma$ and $n$ needed.
\begin{figure}
\centering
%\includegraphics[scale=0.5,angle=-90]{graphs/complete_small_hard_count_a0.5.ps}
\includegraphics[width=0.34\textwidth,angle=-90]{graphs/complete_small_hard_count_a0.5.ps}
\includegraphics[width=0.34\textwidth,angle=-90]{graphs/ham_small_dropped_a0.5.ps}
%\includegraphics[scale=0.5,angle=-90]{graphs/complete_small_hard_count_a1.0.ps}
\includegraphics[width=0.34\textwidth,angle=-90]{graphs/complete_small_hard_count_a1.0.ps}
\includegraphics[width=0.34\textwidth,angle=-90]{graphs/ham_small_dropped_a1.0.ps}
%\includegraphics[scale=0.5,angle=-90]{graphs/complete_small_hard_count_a1.5.ps}
\includegraphics[width=0.34\textwidth,angle=-90]{graphs/complete_small_hard_count_a1.5.ps}
\includegraphics[width=0.34\textwidth,angle=-90]{graphs/ham_small_dropped_a1.5.ps}
\caption{ Number of `hard' instances, i.e. taking more than $n^2$ nodes to solve using the complete Hamiltonicity algorithm.
Graphs are shown for $\alpha= 0.5$ (top), $\alpha=1.0$ (middle), and $\alpha=1.5$ (bottom).
The left hand plots show results for small $n$. The right hand plots show results for larger $n$, where
search was terminated after $n^2$ nodes, so probability plots cannot be given.