Şimşek, A. Serdar

Permanent URI for this collection: https://hdl.handle.net/10735.1/6863

A. Serdar Şimşek is an Assistant Professor of Operations Management. His research interests include:

  • Empirical Operations Management
  • Pricing and Revenue Management
  • Supply Chain Management


Recent Submissions

  • Item
    Pricing Problems Under the Markov Chain Choice Model
    (Wiley, 2019-01) Dong, James; Şimşek, A. Serdar; Topaloglu, Huseyin; 0000-0003-2334-5944 (Şimşek, AS)
    We consider pricing problems when customers choose under the Markov chain choice model. In this choice model, a customer arriving into the system is interested in a certain product with a certain probability. Depending on the price charged for this product, the customer decides whether to purchase the product. If the customer purchases the product, then she leaves the system. Otherwise, the customer transitions to another product or to the no purchase option with certain transition probabilities. In this way, the customer transitions between the products until she purchases a product or reaches the no purchase option. We study three fundamental pricing problems under this choice model. First, for the monopolistic pricing problem, we show how to compute the optimal prices efficiently. Second, for the competitive pricing problem, we show that a Nash equilibrium exists, prove that Nash equilibrium prices are no larger than the prices computed by a central planner controlling all prices and characterize a Nash equilibrium that Pareto dominates all other Nash equilibria. Third, for the dynamic pricing problem with a single resource, we show that the optimal prices decrease as we have more resource capacity or as we get closer to the end of the selling horizon. We also consider a deterministic approximation formulated under the assumption that the demand for each product takes on its expected value. Although the objective function and constraints in this approximation do not have explicit expressions, we develop an equivalent reformulation with explicit expressions for the objective function and constraints.
  • Item
    Technical Note - An Expectation-Maximization Algorithm to Estimate the Parameters of the Markov Chain Choice Model
    (INFORMS, Institute for Operations Research and the Management Sciences) Şimşek, A. Serdar; Topaloglu, H.; 0000-0003-2334-5944 (Şimşek, AS)
    We develop an expectation-maximization algorithm to estimate the parameters of the Markov chain choice model. In this choice model, a customer arrives into the system to purchase a certain product. If this product is available for purchase, then the customer purchases it. Otherwise, the customer transitions between the products according to a transition probability matrix until she reaches an available one and purchases this product. The parameters of the Markov chain choice model are the probability that the customer arrives into the system to purchase each one of the products and the entries of the transition probability matrix. In our expectation-maximization algorithm, we treat the path that a customer follows in the Markov chain as the missing piece of the data. Conditional on the final purchase decision of a customer, we show how to compute the probability that the customer arrives into the system to purchase a certain product and the expected number of times that the customer transitions from a certain product to another one. These results allow us to execute the expectation step of our algorithm. Also, we show how to solve the optimization problem that appears in the maximization step of our algorithm. Our computational experiments show that the Markov chain choice model, coupled with our expectation-maximization algorithm, can yield better predictions of customer choice behavior when compared with other commonly used alternatives. © 2018 INFORMS.
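The customer dynamics described in the first abstract can be sketched with a small simulation. This is an illustrative toy model, not code from the paper: the arrival probabilities, the price-response function `1 / (1 + price)`, and the transition matrix below are all made-up assumptions chosen only to show the mechanics of the Markov chain choice model.

```python
import random

def simulate_choice(arrival_probs, prices, transitions, rng, max_steps=100):
    """Simulate one customer under the Markov chain choice model.

    arrival_probs[i]: probability the customer arrives interested in product i
    prices[i]:        price charged for product i
    transitions[i]:   list of (next_state, prob) pairs; next_state is a
                      product index or None for the no-purchase option
    Returns the index of the purchased product, or None for no purchase.
    """
    # Draw the product the customer is initially interested in.
    product = rng.choices(range(len(prices)), weights=arrival_probs)[0]
    for _ in range(max_steps):
        # Illustrative purchase probability, decreasing in price (assumption).
        buy_prob = 1.0 / (1.0 + prices[product])
        if rng.random() < buy_prob:
            return product          # customer purchases and leaves
        # Otherwise transition to another product or to no purchase.
        states, probs = zip(*transitions[product])
        nxt = rng.choices(states, weights=probs)[0]
        if nxt is None:
            return None             # customer leaves without purchasing
        product = nxt
    return None

rng = random.Random(0)
arrival_probs = [0.6, 0.4]
prices = [1.0, 2.0]
transitions = [[(1, 0.5), (None, 0.5)],   # from product 0
               [(0, 0.3), (None, 0.7)]]   # from product 1
sales = [simulate_choice(arrival_probs, prices, transitions, rng)
         for _ in range(10_000)]
share0 = sales.count(0) / len(sales)
```

Simulating many customers like this gives Monte Carlo estimates of demand at a given price vector, which is the quantity the pricing problems in the paper optimize over (the paper itself works with exact expressions rather than simulation).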
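The second abstract treats the path a customer follows as missing data. The sketch below samples that latent path in the availability-driven version of the model (customer buys the first available product reached); the arrival probabilities, availability pattern, and transition matrix are invented for illustration and are not the paper's data or its E-step computation, which is done in closed form rather than by sampling.

```python
import random

def sample_path(arrival_probs, available, transitions, rng, max_steps=100):
    """Sample a customer's latent path through the Markov chain.

    In the estimation problem only the final purchase is observed; the
    intermediate path is the missing data the E-step integrates over.
    Returns (path, purchased_product).
    """
    n = len(arrival_probs)
    # Product the customer arrives into the system to purchase.
    product = rng.choices(range(n), weights=arrival_probs)[0]
    path = [product]
    for _ in range(max_steps):
        if available[product]:
            return path, product    # first available product is purchased
        # Unavailable: transition according to the row of the matrix.
        product = rng.choices(range(n), weights=transitions[product])[0]
        path.append(product)
    return path, product            # safety cutoff; unreachable in this example

rng = random.Random(1)
arrival_probs = [0.5, 0.3, 0.2]
available = [False, True, True]     # product 0 is out of stock
transitions = [[0.0, 0.6, 0.4],     # from product 0
               [0.0, 0.0, 1.0],
               [0.0, 1.0, 0.0]]
path, bought = sample_path(arrival_probs, available, transitions, rng)
```

Averaging transition counts over many sampled paths, conditional on the observed purchase, approximates the expected counts that the paper's E-step computes exactly.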

Works in Treasures @ UT Dallas are made available exclusively for educational purposes such as research or instruction. Literary rights, including copyright for published works held by the creator(s) or their heirs, or other third parties may apply. All rights are reserved unless otherwise indicated by the copyright owner(s).