An Old School Catholic Message Board

Developer Of HPV Vaccines Comes Clean, Warns Parents: Giant Deadly Sca


ToJesusMyHeart


Yes, I'm arguing that "superficial" observation (compared to targeted scientific falsificationism) is alone proof enough for us to come to SOME conclusions. And I argue that:

Could you offer examples?

 

 

observation + logical inference based upon experience = rational conclusion —even when the conclusion is false

I think your math adds up to a rational hypothesis, not conclusion.

 

 

Truth/falsity is not the same thing as rationality. I'm not arguing for the truth of anti-vaccination beliefs here. I'm arguing that they're rational, and so deserve respect.

People have shown them respect. Scientists have looked into nearly every concern and disproved them. 

 

 

 

Actually, people can draw conclusions out of thin air. It just isn't a very rigorous method of arriving at knowledge.

 

Oh I know they can, haha; it's just not supported in the least.

 

 

 

And I am saying precisely that I do not expect non-scientists to verify the beliefs they have formed from observation. They're not scientists. That's not what they do. What I expect is that scientists will take non-scientists' claims of experience seriously enough that they will investigate them in a rigorous scientific fashion. There's a cooperation between scientific and non-scientific communities that needs to happen in the area of vaccinations. That's my point. It's condescending and irresponsible to ignore a mass of public claims that, say, the flu vaccine is causing people to get sick.

Like I said above, scientists have taken this anti-vac movement VERY seriously; people's lives are on the line. So go do some personal research and find the papers that (as you requested) scientists have published on the matter. They have done a lot.

 

 

 

It could be perfectly rational for someone to believe that seeing a black cat causes toe-stubbing. If the correlation happens only once, it probably is not, as there's no reason to think the two are connected. (It would, in that case, be a false cause fallacy.) But if someone stubs their toe EVERY TIME they see a black cat, it would become increasingly rational for that person to think that seeing a black cat causes toe stubbing. Again, the truth value of a belief is not the same as its rationality. Truth value is determined by correspondence with events in the real world. Rationality is determined by how a belief "hangs together" with one's other beliefs and experiences. Those are very different things.

So, in your opinion, water causes cancer?

 

Based on everything you have said, you would believe it's rational to think water causes cancer, since 100% of people diagnosed with cancer drink water. That's what you're saying?

Edited by CrossCuT

People have different feelings about religion because of their diverse experiences.
But in spite of the fact that people's different religious beliefs are "understandable," it remains that there is but one true faith.

People have different "feelings" about vaccines. But there is true truth out there about whether they are safe and effective.

Science cannot identify the one true religion, but it is the best (and I would argue only) way to find the truth about this or that vaccine.

Do we want feelings-based medicine or science-based medicine?

 

So, with this analogy, are you trying to say that science is infallible as the One True Faith is infallible?

 

There's a difference between knowledge that is revealed by faith and knowledge that is hashed out little by little, experiment by experiment, sometimes getting it wrong, other times getting it right, and over many, many years, putting it all together to arrive at truth—a little bit of truth that can then be expanded little by little, experiment by experiment...


The point is that the "folk wisdom" - and it's not even common among most of the "folk" - is largely equivalent to superstition, at least when it comes to this issue. The examples about the black cat and the drinking water are perfect. It's one thing if these vaccines hadn't been studied. They've been studied over and over! More than a million doses have been given over the course of a decade, and the patterns people talk about are just NOT showing up. Is it theoretically possible that in 50 years we'll know the HPV vaccine causes a bad reaction in certain types of people? Sure. Likely? Nope.

 

Meanwhile there have been only 68 vaccine injury claims paid on the HPV vaccines, with 81 pending and 63 dismissed. The disease it helps prevent kills thousands of women in the United States every year, in spite of the article above that claims it's no big deal. You do the math.
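To make that math concrete, here is a minimal back-of-the-envelope sketch (Python), using only the figures already cited in this thread. The exact annual death toll is not given above, so the 3,000-per-year figure below is a placeholder assumption standing in for "thousands," not a sourced number.

```python
# Rough comparison using the numbers quoted in this thread.
# ASSUMPTION: "thousands of women every year" is represented by a 3,000/year placeholder.

claims_paid = 68            # HPV vaccine injury claims paid (per the figures above)
claims_pending = 81
claims_dismissed = 63
doses_given = 1_000_000     # "more than a million doses" (a lower bound)
years_observed = 10         # "over the course of a decade"

deaths_per_year = 3_000     # placeholder, NOT a sourced figure

deaths_over_decade = deaths_per_year * years_observed
paid_claims_per_million_doses = claims_paid / (doses_given / 1_000_000)

print(f"Paid injury claims per million doses: {paid_claims_per_million_doses:.0f}")
print(f"Deaths from the disease over the same decade: ~{deaths_over_decade:,}")
print(f"Deaths per paid claim: ~{deaths_over_decade / claims_paid:.0f}")
```

Even treating every paid claim as a genuine serious injury, the stated death toll over the same period would exceed the paid claims by a factor of several hundred.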

 

This is the kind of language I find condescending. The water example is a classic case of a bad inference. To equate that to someone who reasons as follows:

 

Everyone in my family got the flu vaccine.

Everyone in my family got very ill the day after they got the flu vaccine.

A bunch of my friends got the flu vaccine and got sick, too.

Doctors have told me things before that didn't pan out the way they said they would.

I guess the flu vaccine isn't as safe as doctors say it is.

 

...is unfair. The above reasoning is perfectly rational.

 

You look at the larger numbers of claims and cases that never came to court or were dismissed. I look at the smaller numbers and think: "Hmmm... what happened there?"

 

The point that the disease kills thousands of women every year is a good one. That's precisely where I think individual judgment needs to enter on the vaccine issue. Rather than running out to get injected with whatever doctors say is good for us now, we ought to stop and consider: "How likely is it that I will get this disease? Is there an epidemic going around? Might that epidemic really kill or maim me? How much of a risk do I care to take here?" Whether to get a vaccine is a very personal issue. So when people say, "I don't need a smallpox vaccine. We haven't had a case of smallpox in the US for over 50 years" or "I don't need a flu vaccine. I'm strong and healthy with no immune deficiencies, so if I get the flu I'll just go to bed for a few days"—when I hear reasoning like that, I think: "Good for you. Make your own choices. As long as you're prepared to deal with the consequences, I support you."



curiousing, on 14 Nov 2013 - 5:50 PM, said:

Yes, I'm arguing that "superficial" observation (compared to targeted scientific falsificationism) is alone proof enough for us to come to SOME conclusions. And I argue that:

Could you offer examples?

 

Most of what we claim to know is arrived at by this "superficial" method of observation. When people say we use the "scientific method" every day, what they typically mean is that we use observation and inference (and sometimes even pseudo-testing of hypotheses) every day. You see this sort of comparison all the time in entry-level science textbooks. It helps make science more accessible to beginners. If you want examples, open up any of the Sherlock Holmes stories. His conclusions are never certain, but they are often very likely. Holmes is brilliant for his mastery of the inference to the best explanation. And I think this is what people do most of their lives, in everyday matters and personal decision making, just not as well as Mr. Holmes.

 

Quote

observation + logical inference based upon experience = rational conclusion —even when the conclusion is false

I think your math adds up to a rational hypothesis, not conclusion.

 

I think your insistence that we have this discussion in scientific terms is an attempt to slant the debate in your favor. I am speaking in everyday language here, wherever possible. By "conclusion" I meant "the statement (in a logical sense) that one derives from two previous statements, one stating an immediate observation, the other an inference informed by past experience". For example:

 

OBSERVATION: It is 9 o'clock on a Thursday and my husband's car is not in the driveway.

EXPERIENCE INFORMING THE INFERENCE FROM OBSERVATION TO CONCLUSION: My husband often goes to the local pub on Thursday nights.

CONCLUSION: My husband is at the pub.

 

Certainly the conclusion is not watertight, but it's very likely, and definitely rational. See also the example I gave to Lilllabettt.
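For what it's worth, the same example can be run through a minimal Bayesian sketch (Python) to show how the conclusion comes out likely but not watertight. All the probabilities below are made-up illustrative assumptions, not anything claimed in this thread.

```python
# A toy Bayesian version of the pub example. Every number is an assumption
# chosen only to illustrate the shape of the inference.

p_pub = 0.60                    # prior: husband goes to the pub most Thursdays
p_car_gone_if_pub = 0.95        # if he's at the pub, the car is almost surely gone
p_car_gone_if_not_pub = 0.20    # car might also be gone for other reasons

p_car_gone = (p_pub * p_car_gone_if_pub
              + (1 - p_pub) * p_car_gone_if_not_pub)
p_pub_given_car_gone = p_pub * p_car_gone_if_pub / p_car_gone

print(f"P(at the pub | car not in driveway) = {p_pub_given_car_gone:.2f}")
# ~0.88 with these numbers: a likely, rational conclusion, but not a certain one.
```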

 

Quote

It could be perfectly rational for someone to believe that seeing a black cat causes toe-stubbing. If the correlation happens only once, it probably is not, as there's no reason to think the two are connected. (It would, in that case, be a false cause fallacy.) But if someone stubs their toe EVERY TIME they see a black cat, it would become increasingly rational for that person to think that seeing a black cat causes toe stubbing. Again, the truth value of a belief is not the same as its rationality. Truth value is determined by correspondence with events in the real world. Rationality is determined by how a belief "hangs together" with one's other beliefs and experiences. Those are very different things.

So, in your opinion, water causes cancer?

 

Based on everything you have said, you would believe it's rational to think water causes cancer, since 100% of people diagnosed with cancer drink water. That's what you're saying?

 

This is possibly the worst twisting of my words I have ever encountered. I argued that, under certain conditions, it may be perfectly rational for a person to come to the conclusion that seeing a black cat causes toe stubbing. I did not say that I believe that seeing a black cat causes toe stubbing. There are two problems, then, with what you say here:

 

1) You leap unjustifiably from my very limited claim to the claim that I believe an absurd statement.

 

2) Your absurd statement is not analogous to the original example. In order for it to be analogous, it would have to be qualified in the following (or similar) fashion: A person lives in a community in which everyone drinks beer, and only beer. Suddenly some of the community members begin drinking water. After several years, many or all of the water-drinkers, but not the beer-drinkers, develop cancer. The individual concludes that water causes cancer. In such a case, this would be a rational conclusion. But it would not likely be correct.

 

 

 

Now, people, I am rather tired of having to single-handedly defend my position against four people ganging up on me. It is also mid-November, and I am an academic. I have three papers to write. So I will now exit this discussion indefinitely. If you would like to continue it, please send me a PM or quote one of my posts saying so, and I will return in December.

Edited by curiousing

So, with this analogy, are you trying to say that science is infallible as the One True Faith is infallible?

 

There's a difference between knowledge that is revealed by faith and knowledge that is hashed out little by little, experiment by experiment, sometimes getting it wrong, other times getting it right, and over many, many years, putting it all together to arrive at truth—a little bit of truth that can then be expanded little by little, experiment by experiment...

I think you're missing my point. There are not multiple "truths" out there about the safety and effectiveness of, say, the flu vaccine. Only one truth. Truth is not plural. Nor is it impossible for us to know the truth about the flu vaccine. Claiming otherwise is post-modernism, and post-modernism is the last thing we need in biological science. Science is the best and only way to cut through the zillion "experiences" and get to the real truth.

And yeah ... the thing is, there are no experiments which point to a trend of vaccine danger or ineffectiveness. Zero. The "truth" about vaccines is not being revealed little by little ... only confirmation after confirmation of what we already know.


I really hate people that come in late in a controversial thread and try to play the "voice of reason" as if they were above the earlier back and forth. That being said, I'm now going to be that guy anyway.

 

First, you can't put all vaccines in one basket. 'Science' that isn't deterministic or theoretical cannot be absolute; it requires lots and lots of data and analysis to get to something conclusive, and this takes time. The more complex the system, the more time you need to accurately weigh the pros and cons. I can't think of a more complex system than the human body, which has so many variables, so many inputs and outputs, and where effects can take years to be measured. So you can't discuss older vaccines at the same time as the new ones, because the data (note: data, not science) around older vaccines will be much different than for the newer ones.

 

Second, biological science is not binary or deterministic.  There are tradeoffs (pros & cons) and the results are not always going to be the same, nor are they absolutely predictable (if only because each human body has so many immeasurable variables.)  With vaccines as with medications, all involved are trying to use data to accurately identify and weigh the pros and cons.  

 

Third, especially in the complex systems described above, 'scientific consensus' is not always scientific.  There are many reasons for this, but hubris and the economics/politics of gov't funding are probably most to blame.

 

There are many, many examples of this, but my favorite is the "food pyramid" promoted in the 80's, which discouraged proteins/fats in favor of carbs. It also warned that excess protein (with an unbelievably low threshold) would cause liver damage. They promoted and taught this theory in medical and nursing schools at least through the 90's, even though it contradicted common sense and the empirical experience of those in the fitness industry. That food pyramid came primarily from the government with help from gov't funded researchers. It was/is wrong, as "science" is now showing. Many blame the current "obesity epidemic" on this big push to eat carbs in place of protein/fat heavy foods (eggs, meat, whole dairy, etc.)

 

So what's this mean? It means that older vaccines like measles, mumps, polio, etc. need to be dealt with differently than new vaccines like this HPV vaccine, like the "flu shot" which changes every year, like I don't know what else because I don't really follow them.

 

It means you should really scratch your head at anyone saying you shouldn't trust the data (data, not science) for these old vaccines. These vaccines have been around for generations, the effects on the viruses in question are objectively measurable, and there's been time to identify adverse effects over years of empirical evidence. Also, it's pretty hard to argue there hasn't been a net societal benefit.

 

It also means I should practice a healthy skepticism when I hear someone using binary/absolute terms to describe something that's anything but - like talking about "conclusive scientific evidence" for a vaccine less than 10 or 20 years old. It's the same reason why you get certain medications only with a prescription and under medical supervision. Could these new vaccines have benefits? Sure. Should someone take them? In the right circumstances, quite possibly. Is there room for skepticism? Sure; after all, that's what scientists should be: skeptical, with the goal of more/new/different testing resulting in more data. So anyone on either side attacking skepticism should be viewed with, well, skepticism.


Could these new vaccines have benefits? Sure. Should someone take them? In the right circumstances, quite possibly. Is there room for skepticism? Sure; after all, that's what scientists should be: skeptical, with the goal of more/new/different testing resulting in more data. So anyone on either side attacking skepticism should be viewed with, well, skepticism.

 

 

 

Science should be skeptical. There is no dogma in good science. But the argument really isn't about skepticism in science.  The argument is about people being skeptical of science.

In this thread is the suggestion that "personal experience"/"culture" and the scientific method are two equivalent ways of evaluating claims about vaccines.

I'm saying that's wrong - it's silly and it reeks of post-modernism, which is the intellectual rot of the liberal arts.


Lilllabettt, I love reading your posts, so I was hesitant when I found myself disagreeing with you a bit. I agree that you were sort of arguing about epistemology (way above my pay grade, btw), but I reviewed the thread as much as I could prior to posting, and y'all were also sort of arguing about the vaccines and when to believe or not believe "science."

 

So while I agree with you that anecdotal/personal experience is not equivalent to the scientific method - I'm saying that in some cases scientists aren't skeptical and aren't even following the scientific method, and individuals are therefore right to be skeptical of certain scientists and their claims - especially for anything new (no time yet for sufficient long-run data) and anything gov't funded (perverse research incentives).


I think you're missing my point. There are not multiple "truths" out there about the safety and effectiveness of, say, the flu vaccine. Only one truth. Truth is not plural. Nor is it impossible for us to know the truth about the flu vaccine. Claiming otherwise is post-modernism, and post-modernism is the last thing we need in biological science. Science is the best and only way to cut through the zillion "experiences" and get to the real truth.

And yeah ... the thing is, there are no experiments which point to a trend of vaccine danger or ineffectiveness. Zero. The "truth" about vaccines is not being revealed little by little ... only confirmation after confirmation of what we already know.

 

I am a staunch postpositivist, so I find it rather shocking (and all of my colleagues would laugh their asses off) that anyone would ever accuse me of pomo tendencies.

 

Please, show me where I made a single claim that could be considered postmodernist. Not some claim you interpreted as postmodernist, or where you read postmodernist implications into it that aren't there. But an explicit postmodernist claim.


This is the kind of language I find condescending. The water example is a classic case of a bad inference. To equate that to someone who reasons as follows:

 

Everyone in my family got the flu vaccine.

Everyone in my family got very ill the day after they got the flu vaccine.

A bunch of my friends got the flu vaccine and got sick, too.

Doctors have told me things before that didn't pan out the way they said they would.

I guess the flu vaccine isn't as safe as doctors say it is.

 

...is unfair. The above reasoning is perfectly rational.

 

 

But you see, the above reasoning is no more rational than the water example. Now most of the people who feel this way aren't stupid, they wouldn't fall for an extreme example like water-causes-cancer. But the logic they use is exactly the same. All my friends  got sick after a flu shot, plus my doctor isn't always right about everything, therefore he must not be right about this and the flu shot is what caused the sickness. There is a massive leap there that is pure observational bias. Just because everyone who does X gets illness Z does not mean that X caused Z. Especially if you have done massive multi-year studies that don't show a connection between X and Z.

 

It's VERY common for humans to think that way though - X happened around the same time as Z, therefore they are related. Scientists who study epidemiology have to be very careful of these red herrings. Something in the way humans evolved has caused this instinct to be very powerful, no matter how many mistakes it causes us to make. It's not rational... it's instinctual.
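To illustrate the point, here is a minimal simulation (Python) of how many "I got sick right after the shot" stories appear even when the shot has no effect on illness at all. The population size and background illness rate are made-up numbers chosen only for illustration.

```python
import random

# ASSUMPTIONS (illustrative only): 100,000 people get a flu shot during cold
# season, and each person independently has a 2% chance of coming down with an
# ordinary cold/flu-like illness in the following week for reasons completely
# unrelated to the shot.

random.seed(0)
people = 100_000
p_background_illness_that_week = 0.02

coincidental_cases = sum(
    1 for _ in range(people) if random.random() < p_background_illness_that_week
)

print(f"Sick within a week of the shot, by coincidence alone: {coincidental_cases:,}")
# Roughly 2,000 per 100,000 will have a "sick right after the shot" story
# even though the shot caused none of it.
```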


But you see, the above reasoning is no more rational than the water example. Now most of the people who feel this way aren't stupid, they wouldn't fall for an extreme example like water-causes-cancer. But the logic they use is exactly the same. All my friends  got sick after a flu shot, plus my doctor isn't always right about everything, therefore he must not be right about this and the flu shot is what caused the sickness. There is a massive leap there that is pure observational bias. Just because everyone who does X gets illness Z does not mean that X caused Z. Especially if you have done massive multi-year studies that don't show a connection between X and Z.

 

It's VERY common for humans to think that way though - X happened around the same time as Z, therefore they are related. Scientists who study epidemiology have to be very careful of these red herrings. Something in the way humans evolved has caused this instinct to be very powerful, no matter how many mistakes it causes us to make. It's not rational... it's instinctual.

 

I have finished one paper and am on a break, and your question actually speaks to matters that I discussed in my paper, so...

 

You keep hammering on the correlation ≠ causation point, but everybody participating in this discussion is fully aware of that, and we are already way beyond that.

 

You seem to have a very simplistic (and erroneous) understanding of what is necessary for rationality. For one, you seem to think that logical validity is necessary, but it is not. Logical validity of an argument is necessary for the truth of a conclusion, but not for the reasonableness of one. For two, you seem to think that logical validity is the ONLY thing necessary for rationality. If it is not even necessary at all, how can it be the only thing necessary?

 

What you appear to be missing is likewise two-fold:

 

(1) Truth and rationality are not the same thing. A belief can be perfectly rational without being true. (And, if you want to get into religion, a belief can also be perfectly true without being rational!)

 

(2) Conditions for rationality are many, complex, and highly situation-dependent. For example, there's a professor in my department who is a decision-making scholar (i.e., an expert on rationality). He is fond of introducing the complexity of rationality with the following story:

 

A man retired from his job and received his entire severance package in one lump sum. On the evening of the day he received the money, he went to the local casino, sat down at the roulette table, and placed his entire severance package on 18. Is what he did rational?

 

Undergrads will immediately shout, "No!" So my colleague then complicates matters: "But he won. Now is it rational?" Some of the undergrads will change their minds, but not all. My colleague keeps his mouth shut, and usually there is then at least one really smart undergrad who pipes up with, "Did he know in advance that he was going to win?" Then my colleague grins his big German grin, because now the discussion can start to get truly interesting.
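As an aside, the undergrads' reflexive "No!" tracks the plain arithmetic of the bet. Here is a minimal sketch, assuming standard American roulette (38 pockets, a 35-to-1 payout on a single number) and a hypothetical stake, details the story itself does not specify.

```python
# ASSUMPTION: American roulette, 38 pockets, single-number bet pays 35 to 1.
# The stake is a hypothetical severance-package figure, in dollars.

pockets = 38
payout_multiple = 35
stake = 100_000.0

p_win = 1 / pockets
expected_value = p_win * (payout_multiple * stake) + (1 - p_win) * (-stake)

print(f"Chance of winning: {p_win:.2%}")                    # ~2.63%
print(f"Expected result of the bet: {expected_value:,.2f}") # ~ -5,263.16
# On average the bet loses about 5.3% of the stake, and the variance is
# enormous, which is why "irrational" is the reflexive answer before any
# context about the man enters the picture.
```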

 

Whether or not the man knew in advance that he would win is an important consideration in determining whether his behavior was rational. Clearly, if he did, then it was rational. But if he didn't, it still wasn't necessarily irrational. There are many more contextual factors we have to consider: Is the man independently wealthy, such that his severance package is but a pittance to him? Does the man know that he will die tomorrow, and want nothing more than to play one last game of roulette? Was the man warned by an "insider" that, if he does not spend all of the money by night's end, the government—whom he hates with a passion—will come and take the lot? Any of these contextual conditions could change our judgment of whether the man's behavior was rational.

 

The point is: Rationality is not determined by truth or any sort of correspondence with reality. It's determined by the "fit" of a behavior/belief with all the other beliefs a person holds, all the other things they know, etc. This is why, if a person does not have medical knowledge, but does have a good deal of mistrust of the medical establishment, their refusal to be vaccinated is perfectly rational: They don't have any "vaccines save lives" beliefs for that behavior to conflict with. Similarly, if a person holds the belief that vaccines are dangerous, it would be IRRATIONAL for that person to go and get a vaccine, but RATIONAL for that person to avoid vaccines. It matters not one bit for the rationality of an individual person's vaccine decisions what the medical establishment says about vaccines if that person does not know what the medical establishment says about vaccines or does not believe what it says about vaccines. In other words: The absence of certain knowledge/beliefs can be as critical to the judgment of whether a person's behavior is rational as is the presence of other knowledge/beliefs.

 

This is also why my modified black cat/toe stubbing, modified "water causes cancer but beer does not", and "all my family members got sick after being vaccinated" examples all demonstrate perfectly rational thinking. (Remember that rationality and logical validity/truth are NOT the same thing.) On the other hand, the original black cat/toe stubbing example (the "one-off" case) and the original totally context-less "water causes cancer" example do not illustrate rational thinking—at least not for the average American who knows enough logic (implicitly, not consciously) to know that that reasoning is bad. It is extremely difficult to imagine a person who is so thoroughly deficient in logical ability that they would believe these examples (although, as you say, people commit errors of the same logical form all the time in reference to more complex cases). If we could imagine such a person, though, someone who has zero understanding of the correlation ≠ causation principle, of the false cause fallacy, etc., then, given the severely limited (and probably warped) system of that person's beliefs, it may nonetheless be rational for them to believe those examples, simply because (1) they cannot possibly know any better, and (2) believing them does not in any way conflict with what they do know.

 

Before Lilllabettt accuses me of more pomo nonsense, let me just say explicitly again: TRUTH AND RATIONALITY ARE DIFFERENT THINGS. I am talking about rationality here, not truth.

 

I suggest, Maggie, that you read the following encyclopedia article in its entirety: http://plato.stanford.edu/archives/win2012/entries/epistemology/#JTB

 

...of which I will post the most relevant portion here:

 

 

 

1. What is Knowledge?
1.1 Knowledge as Justified True Belief

There are various kinds of knowledge: knowing how to do something (for example, how to ride a bicycle), knowing someone in person, and knowing a place or a city. Although such knowledge is of epistemological interest as well, we shall focus on knowledge of propositions and refer to such knowledge using the schema ‘S knows that p’, where ‘S’ stands for the subject who has knowledge and ‘p’ for the proposition that is known.[1] Our question will be: What are the necessary and sufficient conditions for S to know that p? We may distinguish, broadly, between a traditional and a non-traditional approach to answering this question. We shall refer to them as ‘TK’ and ‘NTK’.

 

According to TK, knowledge that p is, at least approximately, justified true belief (JTB). False propositions cannot be known. Therefore, knowledge requires truth. A proposition S doesn't even believe can't be a proposition that S knows. Therefore, knowledge requires belief. Finally, S's being correct in believing that p might merely be a matter of luck.[2] Therefore, knowledge requires a third element, traditionally identified as justification. Thus we arrive at a tripartite analysis of knowledge as JTB: S knows that p if and only if p is true, S believes that p, and S is justified in believing that p. According to this analysis, the three conditions — truth, belief, and justification — are individually necessary and jointly sufficient for knowledge.[3]

 

Initially, we may say that the role of justification is to ensure that S's belief is not true merely because of luck. On that, TK and NTK are in agreement. They diverge, however, as soon as we proceed to be more specific about exactly how justification is to fulfill this role. According to TK, S's belief that p is true not merely because of luck when it is reasonable or rational, from S's own point of view, to take p to be true. According to evidentialism, what makes a belief justified in this sense is the possession of evidence. The basic idea is that a belief is justified to the degree it fits S's evidence. NTK, on the other hand, conceives of the role of justification differently. Its job is to ensure that S's belief has a high objective probability of truth and therefore, if true, is not true merely because of luck. One prominent idea is that this is accomplished if, and only if, a belief originates in reliable cognitive processes or faculties. This view is known as reliabilism.[4]

 

Note that:

 

(1) This entry clearly distinguishes between truth and justification/rationality as two completely separate things.

 

(2) In the bold/underlined portions, the talk of criteria of justification is equivalent to our discussion of criteria of rationality.

 

(3) On the evidentialist view, it suffices for rationality that a person's belief fits THEIR evidence, i.e., the evidence they DO have ("my family members all got sick after vaccinations", "doctors have been wrong in the past", etc.), NOT the evidence they DON'T have ("vaccinations have been found safe over and over again in clinical trials", "these trials were mostly reliable", etc.).

 

(4) On the reliabilist view, it is not the "fit" of a belief with evidence held that matters so much as it is the process or procedure by which the person arrived at the belief. This view is much more complex than the evidentialist one, and there are many different opinions about which processes/procedures qualify as sufficiently reliable for justification.

 

In my opinion, the evidentialist view sets a lower bar for rationality than the reliabilist view, and all I expect from common people (i.e., non-scientists, non-academics, the under-educated, etc.) is that they meet that lower standard. I think that's all one can fairly expect, especially given the state of our educational system.

 

You all are arguing that we ought to hold common people up to the reliabilist standard, which is, in my opinion, much higher. Indeed, it is the standard of science and of most of analytic philosophy. The reliable procedure you seem to require ALL people to follow regardless of educational level is alternately the scientific method or logical deduction. (Or blind faith in science, which for some reason you seem to think perfectly rational. That part I don't understand... Or rather, I do: It's called the "rationalist bias" and it is rampant in America.)

 

While I absolutely expect scientists and other academics and highly educated folk to form most (though not all) of their beliefs to a reliabilist standard, I do not think it fair or even reasonable to expect common people to do so. To consider common people irrational or unreasonable because they meet an evidentialist standard of justification but not a reliabilist one is, in my opinion, unfair, arrogant, and elitist.

 

I hope my view is much more clear now.

 

 

 

 

Personally, I would not be averse to this thread descending into pictures of unicorns at this point.

Edited by curiousing
