Friday, July 23, 2010

Thoughts for today

Today, then. Seems like the way to go is to conduct a single case study that is embedded as opposed to holistic. This case study will allow me to analyze the nature and practical workings of a functioning relationship between a developer and a university or college. It will consist of interviews at the top level, surveys at the employee and student level, and hopefully access to documents from the educators about student completion, internship placements and so on. I'm not sure if there will be any place for observation; perhaps in aid of the educational part I might be privy to meetings between the two groups, though this seems unlikely. I still need to consider whether this single case study of the relationship between the two groups is appropriate, or whether a multiple case study of companies and educators separately is a better option. That will become apparent following some deeper thought on exactly how to answer the 'how' question (I think sub-questions may be in order, or at least a refinement of the 'how' question).

I will still conduct surveys of students and devs (and possibly educators) beyond the case study (or studies), but I need to find a way to link those quantitative results to the case study findings so that one supports the other.

Wednesday, July 14, 2010

A precocious cephalopod invokes Nostradamus and I'm left wondering what it all means. At least the fucking vuvuzelas have been laid to rest.

Dear Diary. I completely forgot about the World Cup until 9am that morning and was not especially bothered by that. I am, as I type, fighting the urge to wander off and look at time-wasting websites; this leads me to consider how the prevalence of procrastination has increased in line with the rise of a time-wasting culture. There are entire online networks dedicated to light distractions. CASE IN POINT: I just spent 5 minutes looking at Fail Blog. Fuck!

I fear that all this pop culture knowledge and meme reference I am gradually gathering will only be of use to me when spending time on the very forums and sites that provided the material in the first place. I need structure, yet again. I will try to write a short blog each night summarizing my day, largely from a research perspective and with the occasional sprinkling of silliness for good measure (can't be too serious about this stuff).

The internet is a dangerous place and plays host to many temptations and yet more distractions; I must fight hard to continue my vitally important work of some sort, probably.

Everything's changed; I'm looking out for my future... something you would never understand, diary. Just kidding, just kidding.

Friday, July 2, 2010

Think think think

So, things are progressing on the PhD front. I am currently immersed in the methodology chapter, and extensive discussion has revealed numerous aspects heretofore unexplored or unconsidered.

What is clear is the need to ensure that each component of the thesis is catered for. This means the methodology chapter must outline the methods to be utilized not only in the data collection stage but also in the analysis stage, as the following illustrates. An outcome of the research will likely be a series of recommendations about game development education courses, in the form of a proposal. It is important to note that in order for that proposal to be adequately validated, an evaluation of some sort is required. This evaluation requires criteria.

Two things arise from this. First, the extent of the proposal must be determined. Is it merely a set of theoretical recommendations? Is it that plus implementation on some level, or is it a full-blown implementation of the proposal (i.e. creating a space, filling it with students and operating it according to the guidelines specified in the proposal)? The last option is unlikely to occur given the time and budgetary constraints. The second is most likely, since a purely theoretical proposal with no practical proof of concept will lack credibility. Once that issue has been settled, one must consider the second issue: determining the criteria used to evaluate the proposal. Assuming a level of observation is conducted and supplemented with student feedback via survey or similar, that data needs stringent analysis using criteria that are independent of the research, i.e. the same criteria could be used to evaluate a different kind of course. Some criteria will clearly be GDE-specific, but these will be derived from data acquired from educators and industry, whose recommendations and opinions on how to deliver GDE will form a platform from which the implementation of the 'ideal GDE environment' can be observed and critiqued.

That aside, I need to identify research methods other than the case study in order to compare and contrast them. Given that this research sits closest to the social sciences, it makes sense to select common approaches from that field and cover them in enough depth to justify use of the case study as the primary method. Another issue arises: the case study may be sufficient to encompass the data gathering and analysis stages, but the proposal (and indeed the proposal to create an intermediary body that liaises between industry and education) and the evaluation will need a second method. This must be covered in the methods chapter: how to derive from the data not only a proposal for an ideal games curriculum (or at least a series of recommendations) but also some of the criteria with which to evaluate it.

A pressing issue is the creation of a paper on collaborative spaces. The angle should be: given the value of collaboration in game development, is there an ideal method of creating an environment conducive to collaborative and beneficial interaction between GDE students? I'll explore research on creative and group/collaborative spaces, and perhaps tie it in to observations that might be made in a classroom environment when group activity is occurring. I'll explore the theory behind collaborative spaces and their existing applications, and look at the actual physical implementation of a collaborative game development space. It remains to be seen whether an opportunity will arise to actually test any hypotheses that are formulated.

Anyway, there's a meeting next week to discuss the ins and outs of creating and managing a course, and what some of the basic criteria are for evaluating a course's effectiveness. I need to get a handle on some of the basic educational standards that should apply to any course, presumably based on extensive research and repeated evaluation. How do you know if the course is successful? How do you measure it? What metrics do you use? Student numbers, assessment styles, results, graduate numbers, employed graduates post-qualification? Hmm hmm hmm.