We present a series of results on conceptually simple algorithms for bipartite matching in various online and related models. We first consider the deterministic adversarial model. The best approximation ratio achievable in this model is $1/2$, and it is achieved by any greedy algorithm. Dürr et al. presented a $2$-pass algorithm called Category-Advice with approximation ratio $3/5$. We extend their algorithm to multiple passes: we determine the exact approximation ratio of the *$k$-pass Category-Advice* algorithm for all $k \ge 1$, and show that it converges to the inverse of the golden ratio, $2/(1+\sqrt{5}) \approx 0.618$, as $k$ tends to infinity. The convergence is extremely fast: the $5$-pass Category-Advice algorithm is already within $0.01\%$ of the inverse of the golden ratio. We then consider *MinDegree*, a natural adaptation to the online stochastic IID model of the well-known and extensively studied offline *MinGreedy* algorithm. Despite the excellent empirical performance of MinGreedy, it was recently shown to have approximation ratio $1/2$ in the adversarial offline setting, the same ratio achieved by any greedy algorithm. Our result in the online known IID model is similar in spirit to this offline result, but the proof is different: we show that *MinDegree* cannot achieve an approximation ratio better than $1-1/e$, the ratio guaranteed by any consistent greedy algorithm in the known IID model. Finally, following the work of Besser and Poloczek, we depart from adversarial and stochastic orderings and investigate a natural randomized algorithm, *MinRanking*, in the priority model. Although the priority model allows the algorithm to choose the input ordering in a general but well-defined way, this natural algorithm cannot match the approximation ratio of the *Ranking* algorithm in the ROM model.
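The stated values are consistent with the $k$-pass ratio being the Fibonacci ratio $F_{2k}/F_{2k+1}$ ($1/2$ for $k=1$, $3/5$ for $k=2$, tending to the inverse golden ratio). A minimal numeric check of that pattern, where the closed form is an assumption inferred from the stated values rather than taken from the abstract:

```python
# Sketch: convergence of the Fibonacci ratios F(2k)/F(2k+1) to the
# inverse golden ratio 2/(1+sqrt(5)) ~ 0.618.
# Assumption: the k-pass Category-Advice ratio equals F(2k)/F(2k+1);
# this matches 1/2 (k=1) and 3/5 (k=2) from the abstract.
from math import sqrt

def fib(n):
    """n-th Fibonacci number with F(0)=0, F(1)=1."""
    a, b = 0, 1
    for _ in range(n):
        a, b = b, a + b
    return a

inv_phi = 2 / (1 + sqrt(5))  # inverse of the golden ratio

for k in range(1, 6):
    ratio = fib(2 * k) / fib(2 * k + 1)
    rel_err = abs(ratio - inv_phi) / inv_phi
    print(f"k={k}: ratio={ratio:.6f}, relative error={rel_err:.2e}")
```

Under this assumption, the $5$-pass ratio is $55/89 \approx 0.61798$, whose relative error against $0.61803\ldots$ is below $0.01\%$, matching the convergence speed claimed in the abstract.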

Type

Publication

In *Theory of Computing Systems*.

Date

July 2019