Studying cancer as an evolutionary disease. News and reviews about research on cancer and evolution.
Wednesday, December 20, 2006
Quote of the week
They cite this anecdote from Ernest Rutherford. It seems that he found a student working late one evening and asked him if he also worked in the mornings. The student answered that that was what he usually did, and Rutherford then asked, "But when do you think?". Sometimes I think to myself that my best ideas usually come in the most unexpected places and circumstances. Now that I think of it, I have never had a good idea while in the office in front of the computer. Since I am now off for the Xmas holidays, there is a chance I will have time for some good ideas.
Wednesday, December 13, 2006
Evolution and creationism in Europe
The Nature issue of a couple of weeks ago reports on how creationism is on the rise in Europe (at least, public awareness of what used to be a dormant line of 'thought' is). They also have an interview with the leader of a European creationist group (who has a PhD in astrophysics and whose photo in the article seems to have been taken by one of his enemies). Nature has also recently published a letter by Maciej Giertych, an MEP and scientist of the Polish Academy of Sciences with a PhD in population genetics, in which he criticises evolution. The publication of this letter in a journal with the reputation of Nature has generated a considerable amount of controversy, as can be seen in the letters sent to the editor and published in the latest issue.
Monday, December 11, 2006
Anderson et al: Tumor morphology and phenotypic evolution driven by selective pressure from the microenvironment
This is the paper I mentioned in my previous post. It is not very common to find a mathematical model in a journal like Cell, so I hope that this is part of a growing trend.
The paper investigates how the microenvironment helps to drive cancer evolution. To do so they use a hybrid cellular automaton model in which cells live on a discrete lattice and the microenvironment (oxygen concentration, extracellular matrix macromolecule concentration and matrix-degrading enzyme concentration) is modelled using continuous variables. The cells are characterised by a number of parameters that determine their behaviour with respect to proliferation, cell-cell adhesion, oxygen consumption, haptotaxis and production of matrix-degrading enzymes. Cells follow a life cycle and only proliferate when they reach a certain age, which depends on the cell's phenotype. During mitosis a cell might alter its phenotype and change its values for proliferation, adhesion, oxygen consumption, etc.
With a heterogeneous microenvironment and heterogeneous cell behaviour you get different patterns of tumour growth, some of them favouring aggressive invading phenotypes and some of them favouring the coexistence of all sorts of phenotypes. With the model they describe, it is possible to study how different microenvironmental factors determine evolution. The results show that harsh environments (little oxygen) select for aggressive phenotypes, whereas milder environments allow for the coexistence of a much bigger range of phenotypes, and these tumours are unlikely to be invasive.
The model is very interesting and the conclusions seem pretty reasonable: tough microenvironments lead to aggressive tumours. My intuition tells me that, on the other hand, heterogeneous populations are more likely to be able to cope with an external aggression, which would imply that a less aggressive but more diverse tumour would not respond well to therapies that target any specific kind of behaviour. The main problem with the paper is that the model is fairly complicated to validate clinically.
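To make the structure of such a model a bit more concrete, here is a minimal sketch of a hybrid cellular automaton in the same spirit. This is my own toy version, not the authors' code: the single 'oxygen' field, the phenotype parameters and all the numerical values are invented for illustration.

```python
# Toy hybrid cellular automaton: discrete cells on a lattice on top of a
# continuous oxygen field. Not Anderson et al.'s model; parameters are made up.
import numpy as np

rng = np.random.default_rng(0)
N = 50                      # lattice size
oxygen = np.ones((N, N))    # continuous microenvironment variable
cells = {}                  # (i, j) -> phenotype dict

# seed a small tumour in the centre of the lattice
for di in range(-1, 2):
    for dj in range(-1, 2):
        cells[(N // 2 + di, N // 2 + dj)] = {
            "age": 0, "div_age": 10, "o2_uptake": 0.02, "mut_rate": 0.1}

def neighbours(i, j):
    return [((i + di) % N, (j + dj) % N)
            for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1))]

for step in range(200):
    # 1. diffuse oxygen with explicit finite differences; the domain boundary
    #    acts as a constant source
    lap = (np.roll(oxygen, 1, 0) + np.roll(oxygen, -1, 0) +
           np.roll(oxygen, 1, 1) + np.roll(oxygen, -1, 1) - 4 * oxygen)
    oxygen += 0.1 * lap
    oxygen[0, :] = oxygen[-1, :] = oxygen[:, 0] = oxygen[:, -1] = 1.0

    # 2. each cell consumes oxygen, ages, dies in hypoxia, or divides into
    #    an empty neighbouring site, possibly mutating its phenotype
    for pos, phen in list(cells.items()):
        oxygen[pos] = max(0.0, oxygen[pos] - phen["o2_uptake"])
        if oxygen[pos] < 0.05:          # harsh environment: the cell dies
            del cells[pos]
            continue
        phen["age"] += 1
        if phen["age"] >= phen["div_age"]:
            empty = [n for n in neighbours(*pos) if n not in cells]
            if empty:
                child = dict(phen, age=0)
                if rng.random() < phen["mut_rate"]:
                    # mutation towards a more aggressive phenotype:
                    # divides sooner and consumes more oxygen
                    child["div_age"] = max(2, child["div_age"] - 1)
                    child["o2_uptake"] *= 1.1
                cells[empty[rng.integers(len(empty))]] = child
                phen["age"] = 0

print(len(cells), "cells after 200 steps")
```

The point is only to show the hybrid part: discrete agents carrying mutable phenotype parameters sit on top of a continuous field that they both consume and respond to.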
Tuesday, December 05, 2006
We are big news!
I often browse a tech-news website called Slashdot. One of today's entries is the following: "Computer simulations of cancer growth". There they talk about research performed by Sandy Anderson (Dundee, part of the Marie Curie Network in which I am involved), Vito Quaranta (Vanderbilt; I talked about him in my post on the Lyon workshop in late September) and colleagues. They have just had a paper published in Cell, which I will talk about in a later post.
At any rate, it is impressive that sites such as Slashdot report on mathematical and computational models of cancer research. The audience of Slashdot is reputed to be very competent in matters of IT, but I could see that some of them are medically competent too (I mean, enough to convince a computer scientist like myself, not necessarily more than that), even if, as with many news items on Slashdot, people tend to concentrate on what they want to say regardless of the news they are supposed to be commenting on.
Friday, December 01, 2006
Cancer and development
One of the things many people interested in biology but without a background in biology believe (I hope I am not just describing myself here) is that information flows in only one direction: genes - mRNA - proteins. Actually, information can also flow in the opposite direction. Enzymes such as reverse transcriptase can copy fragments of RNA back into DNA. This is of course a technique used by viruses in order to alter the genetic programme of a cell so that it produces more copies of the virus. This system is also used to change the genetic programme of a cell during development, so if the work of the enzyme is hindered, so is development (at least at some crucial steps).
It seems that cancer cells have a lot of reverse transcriptase (why this is so is, unfortunately, not explained in the article), and thus treatments used against viral diseases could be used to hinder tumour growth. In vivo experiments with mice transplanted with human cancer cells show that there is a correlation between tumour growth and the use of HIV treatments that inhibit the reverse transcriptase enzymes.
This is one more example of how development and cancer are connected. My take (and I don't claim to be the first one with this insight) is that we would not have cancer if we were not the result of developmental processes.
Tuesday, November 28, 2006
Speakers in Step conference
This Step conference was not meant to be about science per se, so the talks were definitely not of a technical nature. James introduced the Physiome project which, as you might know, is about putting together all the current and future knowledge about human physiology with the aim of improving health care. The ideal result would be a giant simulation of human physiology that could behave like a real whole organism. Such a system would allow physicians and other researchers to test therapies quickly and without nasty side effects, and to study 'what if' scenarios. What James thinks we need is:
* Training (no use having sophisticated systems if physicians don't use them)
* Databasing
* Standards (Too many groups out there and no way to compare or integrate their work)
* Modelling archives (I got a nice model, where do I put it for other people to play with?)
* Modelling tools
All in all, a nice and light introductory talk. Everything he mentioned is quite reasonable, although I am not sure it is realistic to expect any of these things to happen in the short term. People so far seem to be happy to come up with their own models, and not much effort is made to see if the results of one model are consistent with the results of the model of a different group.
The next talk came from Brian Goodwin who, although he used to be at the Santa Fe Institute, is now a professor of 'holistic science' (which seems quite a scary name for a professorship). The theme of his talk? Computational biology: a clash of cultures. The part of the talk I found most interesting was when he dealt with the ambiguity of languages. Human languages are ambiguous and the meaning of a sentence gets shaped as we speak. This seems to be a good analogy for understanding the language of genes, which is also ambiguous (which is nice if you want to evolve it). In his view, both human and gene languages have the property of being the best compromise between the effort that the speaker has to make to convey a message and the effort of the listener. This is an interesting idea, although I guess that proving it might be quite complicated (note to self: take a look at what has been published about this).
The talk from Denis Noble was also interesting, despite the fact that his major point was: I have a new book ("The Music of Life"), go and buy it (which I might do). He made a number of points:
1. There is no gene for function (no objections to that)
2. Transmission of information is not just one way (same here)
3. DNA is not the only transmitter of inheritance (heard that before)
4. Law of relativity in biology: there is no privileged level of causality. Message to Dawkins: the gene is not that important.
5. There is no genetic programme (message to Monod this time).
6. Actually there are no programmes at any level
7. ...and that means not even at the brain level
Thursday, November 23, 2006
Cancer and stem cells
The researchers tried to find out how relevant stem cells are for cancer growth. They show that in animal experiments (much more convincing than in vitro), animals injected with colon cancer stem cells are more likely to develop cancer than those injected with non-stem cancer cells.
It all sounds reasonable to me: one of the capabilities that tumour cells have to acquire for the tumour to become a cancer is limitless replicative potential. If you inject into an animal cells that already have that capability, that should make it easier for the cancer to appear. Also, it is known that some tumour cells, as they mutate, might revert to an undifferentiated state with stem-like behaviour. Therapies that specifically target cancer stem cells should be the next step, since stem cells amount to a small proportion of the cells in the body but seem to have such a great potential in cancer initiation.
Monday, November 20, 2006
Evolution on a chip
Normally, when theoretical biologists talk about biology in silico they are thinking of computer models of biology, but this time the in silico refers to silicon chips that have been used to create patched environments, each patch representing a different microenvironment (the main difference between the patches being the availability of nutrients). In these patches they placed colonies of E. coli and let them grow. The bacteria were allowed to move from one patch to the next through narrow corridors.
Interestingly, but maybe not surprisingly, the bacteria move towards more promising neighbouring patches and sometimes adapt, genetically and physiologically, to the environment. Aside from some interesting experiments, the authors have been kind enough to produce a mathematical model to study the evolution in silico, as well as an analysis of how bacterial density in a patch evolves as nutrient availability gets depleted and competition gets tougher.
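Just to fix ideas, here is a toy version of that kind of patch model. It is my own sketch, not the authors' equations: the patch layout, the rates and the rule "move towards the richer neighbour" are all invented for illustration.

```python
# Toy metapopulation on a chain of patches: growth limited by a depleting
# nutrient, plus migration through corridors towards richer neighbours.
import numpy as np

n_patches = 8
nutrient = np.linspace(0.2, 1.0, n_patches)   # patches differ in resources
density = np.full(n_patches, 0.01)            # initial bacterial density
growth, uptake, migration, dt = 1.0, 0.5, 0.05, 0.1

for step in range(2000):
    # growth is proportional to the local nutrient, which gets consumed
    birth = growth * nutrient * density
    nutrient = np.maximum(0.0, nutrient - uptake * birth * dt)
    density = density + birth * dt

    # migration through the corridors: a fraction of each population moves
    # towards whichever neighbour currently has more nutrient
    flux = np.zeros_like(density)
    for i in range(n_patches):
        for j in (i - 1, i + 1):
            if 0 <= j < n_patches and nutrient[j] > nutrient[i]:
                moved = migration * density[i] * dt
                flux[i] -= moved
                flux[j] += moved
    density = density + flux

print(np.round(density, 3))
```

Even this crude version reproduces the qualitative story: populations build up where nutrients last longest, and drain away from patches that get depleted first.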
It is really interesting stuff, but it seems that they need to make the patches a little more complicated in order to get more adaptation to the environment and less migration towards the greener grass.
Thursday, November 09, 2006
The Step conference
The Physiome project (or at least what I understood of it after being exposed to the idea for the very first time during this conference) is a highly ambitious project (and that is probably an understatement) whose aim is to integrate all the current and future knowledge about human physiology. The idea is thus a multiscale, modular framework in which all the models of the different parts of human physiology could be integrated. Such a model would have a tremendous impact on our understanding of physiology, let alone the potential benefits for pharmaceutical companies. For all of you who have any experience modelling biological processes, I guess I don't need to tell you how (let's understate it once again) challenging this could be. In any case, I am fine with any (extremely) difficult project as long as the intermediate steps are worth something.
In my opinion, the people in the Step project should aim at something rather more modest, such as a system by which modellers can integrate just a few models together, so that different groups can check the consistency of their models and their assumptions. This process will probably take a long while, but eventually most modellers will be used to thinking of their models not in isolation but as something that has to make sense in the context of all the models being developed elsewhere. There should be some infrastructure so the models can be shared between researchers, and some protocols and interfaces between models at different scales or across the same scale (say molecular, cellular or tissue) so that there can be integration.
One of the speakers mentioned that the keywords in this project are multiscale and modularity. I suggest taking a look at the field of software engineering, in which different groups and companies work on different modules and at different levels of abstraction, and the software produced is expected to work with other software modules. Of course the complexity to be managed in the Physiome project is different, but I still think it would be a good starting point.
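To illustrate the software-engineering analogy, here is a rough sketch of what I have in mind. It is entirely hypothetical: the interface, the class names and the toy dynamics are mine, not anything actually proposed by the Physiome project.

```python
# Sketch of a minimal common contract that models at different scales could
# agree on, so that an integrator can wire them together and groups can
# cross-check each other's assumptions. All names here are invented.
from abc import ABC, abstractmethod

class ScaleModel(ABC):
    """The contract every module agrees to expose."""
    @abstractmethod
    def step(self, dt: float) -> None: ...
    @abstractmethod
    def outputs(self) -> dict: ...            # quantities other models may read
    @abstractmethod
    def set_inputs(self, values: dict) -> None: ...

class TissueModel(ScaleModel):
    """Tissue scale: provides the oxygen level the cell scale depends on."""
    def __init__(self):
        self.oxygen = 1.0
    def step(self, dt):
        self.oxygen = max(0.0, self.oxygen - 0.01 * dt)   # slow depletion
    def outputs(self):
        return {"oxygen": self.oxygen}
    def set_inputs(self, values):
        pass

class CellModel(ScaleModel):
    """Cell scale: toy metabolism driven by the oxygen it is fed."""
    def __init__(self):
        self.atp = 1.0
        self.oxygen = 1.0
    def step(self, dt):
        self.atp += dt * (self.oxygen - 0.5)
    def outputs(self):
        return {"atp": self.atp}
    def set_inputs(self, values):
        self.oxygen = values.get("oxygen", self.oxygen)

# a trivial "integrator" coupling the two scales only through the interface
tissue, cell = TissueModel(), CellModel()
for _ in range(100):
    tissue.step(0.1)
    cell.set_inputs(tissue.outputs())
    cell.step(0.1)
print(cell.outputs())
```

The substance is of course trivial here; the point is that consistency checks between groups only become possible once everybody exposes their model through some agreed interface like this.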
Thursday, November 02, 2006
Off to Brussels
The website of the conference is here.
Will report back at the end of next week.