Player on First! Can a Markov Model Get Him to Second?
Henry R. Black, MD: Hi. I'm Dr. Henry Black, Clinical Professor of Internal Medicine at the New York University School of Medicine [New York, NY] and immediate past President of the American Society of Hypertension. I'm here with my friend and colleague Dr. Andrew Vickers from Sloan-Kettering talking about biostatistics in the 21st century. We are way beyond chi-square now. To understand what to do, we need help from our biostatistical colleagues.
Andrew J. Vickers, DPhil: I'm glad to be here.
Dr. Black: Let me ask you about 2 things: Markov models and Monte Carlo simulations. I have become involved with these 2 things, and I'm not sure what I did. What's a Markov model?
Dr. Vickers: The Markov model comes from decision analysis and cost-effectiveness analysis. The way to think about it is a classic decision-analytic problem: Should you give a patient an old drug or a new drug? The new drug gives them a better chance of a quick cure, but it's more expensive or has a side effect.
So you could say, "What could happen?" The patient is either going to be cured early or cured late. There is a probability of cure with the new drug or a probability of cure with the old drug. You can put a value on the importance of an early cure and so on.
We can come back to whether that's a good value. The Markov model reflects the fact that life is rarely that simple. You don't remain in one state. For example, in my own field of prostate cancer, first you don't have cancer. Then you develop cancer and you're treated. Then you might have a recurrence, and then you get what's called a "hormone-refractory cancer." Then you have metastases, and then you die. So you pass through these various states, and they are not permanent; it's not just a matter of being cured either early or late.
The best way to explain a Markov model is actually with baseball. You have a player on first base and a pitch is thrown. That runner is either going to stay on first base, advance to second base, or the inning will end. A batter gets a hit and reaches first. Then, on the next pitch, the batter may hit again, so the runner can move to second; if the batter swings and misses, the runner stays on first. You could say you're in the state of being on first base, and with every pitch that is thrown, you have a probability of going to second, a probability of staying put, and a probability that the inning is over.
What you are trying to do with the Markov model is identify all the states of health that the patient could be in. Then you run a very sophisticated statistical model. You say, "Okay, 1 day has just gone by. What's the probability that this patient's state of health has changed?" Either the patient's state of health has changed or it hasn't.
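[Editor's note: a minimal sketch of what such a model looks like in code. The states follow the prostate cancer example above, but every transition probability and the cycle length are invented purely for illustration; none of these numbers come from the discussion.]

```python
import numpy as np

# Hypothetical health states for a Markov model of prostate cancer.
states = ["treated", "recurrence", "hormone_refractory", "metastatic", "dead"]

# transition[i][j] = probability of moving from state i to state j in one
# cycle (say, one month). Each row sums to 1; "dead" is an absorbing state.
# All values below are made up for illustration.
transition = np.array([
    [0.98, 0.02, 0.00, 0.00, 0.00],   # treated
    [0.00, 0.95, 0.04, 0.00, 0.01],   # recurrence
    [0.00, 0.00, 0.92, 0.06, 0.02],   # hormone_refractory
    [0.00, 0.00, 0.00, 0.90, 0.10],   # metastatic
    [0.00, 0.00, 0.00, 0.00, 1.00],   # dead (absorbing)
])

# Start the whole cohort in the "treated" state and step forward in time.
# After each cycle we ask, as in the discussion: what is the probability
# that the patient's state of health has changed?
distribution = np.array([1.0, 0.0, 0.0, 0.0, 0.0])
for cycle in range(120):                 # e.g., 120 monthly cycles
    distribution = distribution @ transition

for state, prob in zip(states, distribution):
    print(f"{state:>18}: {prob:.3f}")
```

The defining assumption is that where you go next depends only on the state you are in now, not on how you got there, just as the runner's chance of reaching second depends only on being on first, not on how he got to first.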
Monte Carlo is a place where there are lots of casinos -- it's luck and chance, and so forth. The way that these models actually run is that you create this cohort of simulated patients and simulate what happens to them by chance. For example, if a man is treated surgically for prostate cancer, he has a 20% chance of a cancer recurrence. Out of our cohort, we'll randomly select 20% of the men and say that they had a recurrence, using random numbers from the computer.
You run that many, many times and take the average of all those results. You can say, "Okay, these are the states that people would have passed through, and this is how long they would have been in each state given various treatment options." We can then come to some conclusion.
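[Editor's note: a sketch of the Monte Carlo version of the same idea. Instead of pushing a probability distribution through a transition matrix, you follow individual simulated patients with random numbers and average over many runs. The 20% recurrence figure is the one quoted above; the cycle length, the states, and the other probabilities are assumptions for illustration only.]

```python
import random

def simulate_patient(p_recurrence=0.20, cycles=120, rng=random):
    """Follow one simulated surgical patient through hypothetical states.
    Only the 20% lifetime recurrence risk comes from the discussion;
    everything else is invented."""
    state = "treated"
    months_in_state = {"treated": 0, "recurrence": 0, "dead": 0}
    # Spread the 20% lifetime recurrence risk over the cycles as a rough
    # per-cycle risk.
    p_per_cycle = 1 - (1 - p_recurrence) ** (1 / cycles)
    for _ in range(cycles):
        months_in_state[state] += 1
        if state == "treated" and rng.random() < p_per_cycle:
            state = "recurrence"
        elif state == "recurrence" and rng.random() < 0.01:  # assumed death risk
            state = "dead"
    return months_in_state

def run_cohort(n_patients=10_000):
    """Run many simulated patients and average the time spent in each state."""
    totals = {"treated": 0, "recurrence": 0, "dead": 0}
    for _ in range(n_patients):
        for state, months in simulate_patient().items():
            totals[state] += months
    return {state: months / n_patients for state, months in totals.items()}

print(run_cohort())
```

Averaging over many simulated patients gives the expected time in each state, and the simulation framing makes it straightforward to attach costs or utilities to each individual history when comparing treatment options.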
Dr. Black: When you are all done with that, how much can we trust the research that says: because of this model, we should follow this path or that path? Back in 2000, we worked with a statistician on a model comparing aggressive treatment criteria in diabetes with less aggressive criteria. We took 4 goal recommendations from the Joint National Committee, made estimates for various outcomes -- dialysis, kidney failure, and so on -- and calculated that, using the stricter criteria, we would actually save money. It's not just that the utility would be different; you would actually save money. There are very few things in medicine for which we actually save money.
Dr. Vickers: Exactly.
Dr. Black: It got absolutely no pickup; nobody seemed to care much about it. That was a Markov model, and I'm proud of it; I just wish somebody had paid attention. Now that we have done trials of aggressive therapy in diabetes, in particular, it doesn't quite turn out that way. Aggressive therapy isn't necessarily helpful. One of my concerns is about that and about meta-analyses, which we have talked about.
Dr. Vickers: You're asking whether a Markov model is a good approach?
Dr. Black: Is it good enough?
Dr. Vickers: Yes. The methodology itself is totally sound. There is no problem with the methodology. You can think of it a bit like a black box. You put some stuff in, and then you turn the handle and out comes your result. Of course, what you end up with depends on what you put in, right? The output is very dependent on the input.
In prostate cancer, just recently a Markov model was published in the Journal of the American Medical Association. What do you do with an older, low-risk patient? Should he get treatment or not?
The history was quite interesting, because they started by assuming that treatment has no effect, and concluded that these patients shouldn't get treatment. So people were saying, "Of course you shouldn't get treatment if treatment has no effect. That's just obvious."
Then they said, "All right, let's say that treatment has a little bit of an effect." With a small enough assumed effect, it's still not worth getting treatment; but if you assume the treatment has somewhat more of an effect, then it is worth getting treatment. So the outputs are very dependent on the inputs.
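[Editor's note: to make the "outputs depend on inputs" point concrete, here is a toy sensitivity analysis. The decision rule and every number in it are hypothetical; this is not a reconstruction of the JAMA model.]

```python
# Toy sensitivity analysis: vary the assumed treatment effect and see
# whether the recommendation flips. All numbers are invented.
BASELINE_10YR_MORTALITY = 0.10            # assumed risk without treatment
TREATMENT_HARM_QALYS = 0.05               # assumed quality-of-life cost of treatment
LIFE_YEARS_SAVED_IF_DEATH_AVERTED = 5.0   # assumed

def net_benefit(assumed_relative_risk_reduction):
    """Expected quality-adjusted life-years gained minus harm (toy model)."""
    deaths_averted = BASELINE_10YR_MORTALITY * assumed_relative_risk_reduction
    return deaths_averted * LIFE_YEARS_SAVED_IF_DEATH_AVERTED - TREATMENT_HARM_QALYS

for rrr in [0.0, 0.05, 0.10, 0.20, 0.30]:
    benefit = net_benefit(rrr)
    verdict = "treat" if benefit > 0 else "don't treat"
    print(f"assumed risk reduction {rrr:.0%}: net benefit {benefit:+.3f} QALYs -> {verdict}")
```

With these made-up inputs, the recommendation flips from "don't treat" to "treat" somewhere between a 10% and a 20% assumed risk reduction, which is exactly the kind of sensitivity to assumptions being described here.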
Dr. Black: The media would say "don't treat" or "do treat," and the poor patient and doctor would have to interpret that.
Dr. Vickers: You have to worry about this. When you see 1 of these decision analyses with a Markov model, whether they are simple or more complex, you have to understand what criteria were used. What did they put into the model?
Dr. Black: Dr. Vickers, thank you very much. This is a very interesting topic. Whether we like it or not, as people who practice medicine, we had better understand what's going on or we will have trouble.
Dr. Vickers: Thanks for inviting me.