Monthly Archives: February 2018

One more database position at Braviant Holdings

We have one more opening! And we want to fill this one ASAP. Here is a position description (and a place to apply): Postgres DBA at Braviant

A person in this position will report to me, help me build a state-of-the-art system using the most cutting-edge technologies, and, last but not least, will be able to learn the best practices of the trade from me. The position description reads more like a DBA role, and that's what I need most at the moment, but if the person in this position shows interest in more database development work, the opportunities are endless.

There are more exciting things to do than anybody can imagine, but I desperately need at least one more pair of hands!

 

2 Comments

Filed under Companies, Workplace

The second rejected paper: the ORIM again

Object-relational impedance mismatch is by far my favorite research topic, mostly because it has very practical applications. I would make an even stronger statement: the most rewarding optimization is the one where you reduce the number of SQL statements executed when a web page is rendered, and all of a sudden it loads 50 times faster (and I mean actually 50 times, not figuratively speaking!). It always looks like magic – and I haven't really done anything!
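
To give an idea of what this kind of optimization looks like, here is a minimal sketch with hypothetical customer, loan, and payment tables (not our actual schema): instead of letting an ORM issue one query per object, the whole page payload is assembled by a single SQL statement, in a single round trip.

-- Hypothetical schema: a page that shows a customer, her loans,
-- and the payments on each loan. $1 is the customer id parameter.
-- An ORM typically renders such a page with 1 + N round trips:
--   SELECT ... FROM customer WHERE customer_id = $1;
--   SELECT ... FROM loan     WHERE customer_id = $1;
--   SELECT ... FROM payment  WHERE loan_id = $2;   -- repeated for every loan
-- The same data can be fetched in one round trip, as a single JSON object
-- that the application only has to render:
SELECT json_build_object(
         'customer_id', c.customer_id,
         'name',        c.name,
         'loans',
         (SELECT coalesce(json_agg(json_build_object(
                   'loan_id', l.loan_id,
                   'amount',  l.amount,
                   'payments',
                   (SELECT coalesce(json_agg(json_build_object(
                             'paid_on', p.paid_on,
                             'amount',  p.amount)), '[]'::json)
                      FROM payment p
                     WHERE p.loan_id = l.loan_id))), '[]'::json)
            FROM loan l
           WHERE l.customer_id = c.customer_id)) AS page_payload
  FROM customer c
 WHERE c.customer_id = $1;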

That being said, ORMs are my worst enemies, and I am always looking for opportunities to promote better ways of communication between a database and an application. Most of the time the human factor turns out to be more important than the technological challenges, so I always think of these projects as battles.

At Braviant, however, for the first time in my professional career I had nobody to fight with over this issue – the app developers were completely on board with my approach from day one. This allowed us to develop something really cool and to optimize the interaction between the database and the application to the point of absolute perfection. So when my husband suggested we write a short paper about this project, I had no doubt it would be accepted, because two of my previous papers on the same subject had been accepted to very serious conferences.

Life proved me wrong :). I am not going to name the conference or the workshop, but I have to make some comments about the reviews, so that the level of my frustration can be understood.

One of the reviewers asked why we think that the number of round trips defines the response time of a web application. Another reviewer asked whether we had tried MongoDB :))). And why we think that (de)serialization of JSON takes negligible time. And why we think Hibernate is worse.

I think the only valid objection was that the topic of the paper was not relevant to the workshop's focus. And that might explain the whole story.

Several years ago, when I started attending database conferences again after fifteen years of absence, I observed that a significant number of the attendees had never seen real applications and had never dealt with performance problems. Fortunately, I have also met and gotten to know some really outstanding researchers, whom I admire and feel honored to be acquainted with, so… I am sure I will find the right place to showcase our work.

And maybe it's time to get back to my old “HDAT” workshop idea…

And for my fellow Chicagoans: I will be presenting this work this Tuesday, Feb 13 at the Chicago PUG meetup!

8 Comments

Filed under research

Our bitemporal paper was rejected, and how I feel about it

Actually, this winter I had not one but two papers rejected. And although I never dispute rejections (a rejection just means I failed to present my work adequately), I wanted to reflect on why both papers were rejected and what I can do to get them accepted at other conferences.

With our bitemporal paper, I was really upset that it didn't make it into ICDE 2018, because I know that the work itself is orders of magnitude better than the work that was accepted for ICDE 2016. That leaves me with two options: either the topic was not relevant to the Industrial track, or we didn't present our work well enough for its novelty to be visible.

I think it's more that we didn't explain ourselves well enough. I was trying not to dedicate a third of the paper to explaining the theory that lies underneath our implementation, and now I think that was a mistake. I didn't elaborate on the fact that our second dimension is asserted time, not system time, or on what the semantic difference is. So when our reviewers say “everybody has bitemporal time” – yes, that's correct, but our two-dimensional time is different!

I know that the “asserted time” concept is not that easy to grasp when you read about it for the first time, and we didn't provide any formal definitions. Nor did we provide formal definitions for the bitemporal operations. It does not matter that we followed the asserted versioning framework bible… We should have given the formal definitions, and we should have highlighted that it's not a “bitemporal implementation for Postgres”, but rather that “we use Postgres to implement the asserted versioning framework, because Postgres has some cool features that make it easier”.
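
To show what I mean by the two dimensions, here is a deliberately simplified illustration (hypothetical table, not the actual schema or API of our bitemporal library): both effective time and asserted time are modeled as Postgres range columns, and an exclusion constraint guarantees that, wherever assertion periods overlap, effective periods of the same customer do not.

-- Simplified, hypothetical example of two-dimensional time in Postgres.
CREATE EXTENSION IF NOT EXISTS btree_gist;  -- allows "=" inside a GiST exclusion constraint

CREATE TABLE customer_bt (
    customer_id int       NOT NULL,
    name        text      NOT NULL,
    effective   tstzrange NOT NULL,  -- when the fact is true in the modeled world
    asserted    tstzrange NOT NULL,  -- when we assert that this row describes the world correctly;
                                     -- unlike system time, it may start in the future and is
                                     -- closed by a correction rather than by a physical update
    -- no two rows for the same customer may be asserted at the same time
    -- with overlapping effective periods
    EXCLUDE USING gist (customer_id WITH =,
                        effective   WITH &&,
                        asserted    WITH &&)
);

-- "What did we assert on 2018-01-01 about customer 42, effective on that same day?"
SELECT name
  FROM customer_bt
 WHERE customer_id = 42
   AND effective @> '2018-01-01 00:00+00'::timestamptz
   AND asserted  @> '2018-01-01 00:00+00'::timestamptz;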

Oh well, there is always a next conference :). Also, I think we should split this paper into smaller pieces – this one was an attempt to summarize three years of development.

Something to work on! And also – to continue development of the bitemporal library itself.

Leave a comment

Filed under research